Mathematical Problems in Engineering
Volume 2015 (2015), Article ID 241436, 18 pages
http://dx.doi.org/10.1155/2015/241436
Research Article

Enhancing Both Efficiency and Representational Capability of Isomap by Extensive Landmark Selection

School of Mathematics and Statistics and Institute for Information and System Science, Xi’an Jiaotong University, Xi’an 710049, China

Received 24 November 2014; Accepted 20 February 2015

Academic Editor: Wanquan Liu

Copyright © 2015 Dong Liang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Linked References

  1. J. B. Tenenbaum, V. de Silva, and J. C. Langford, “A global geometric framework for nonlinear dimensionality reduction,” Science, vol. 290, no. 5500, pp. 2319–2323, 2000.
  2. S. T. Roweis and L. K. Saul, “Nonlinear dimensionality reduction by locally linear embedding,” Science, vol. 290, no. 5500, pp. 2323–2326, 2000.
  3. G. E. Hinton and R. R. Salakhutdinov, “Reducing the dimensionality of data with neural networks,” Science, vol. 313, no. 5786, pp. 504–507, 2006.
  4. Y. Leung, D. Meng, and Z. Xu, “Evaluation of a spatial relationship by the concept of intrinsic spatial distance,” Geographical Analysis, vol. 45, no. 4, pp. 380–400, 2013.
  5. Z. Han, D. Y. Meng, Z. B. Xu, and N. N. Gu, “Incremental alignment manifold learning,” Journal of Computer Science and Technology, vol. 26, no. 1, pp. 153–165, 2010.
  6. D. Y. Meng, Y. Leung, Z. B. Xu, T. Fung, and Q. F. Zhang, “Improving geodesic distance estimation based on locally linear assumption,” Pattern Recognition Letters, vol. 29, no. 7, pp. 862–870, 2008.
  7. D. Y. Meng, Y. Leung, and Z. B. Xu, “Detecting intrinsic loops underlying data manifold,” IEEE Transactions on Knowledge and Data Engineering, vol. 25, no. 2, pp. 337–347, 2013.
  8. L. K. Saul and S. T. Roweis, “Think globally, fit locally: unsupervised learning of low dimensional manifolds,” Journal of Machine Learning Research, vol. 4, no. 2, pp. 119–155, 2004.
  9. V. de Silva and J. B. Tenenbaum, “Global versus local methods in nonlinear dimensionality reduction,” in Proceedings of the Neural Information Processing Systems (NIPS '02), vol. 15, pp. 705–712, 2002.
  10. J. A. Lee, A. Lendasse, and M. Verleysen, “Nonlinear projection with curvilinear distances: isomap versus curvilinear distance analysis,” Neurocomputing, vol. 57, no. 1–4, pp. 49–76, 2004.
  11. M. Belkin and P. Niyogi, “Laplacian eigenmaps for dimensionality reduction and data representation,” Neural Computation, vol. 15, no. 6, pp. 1373–1396, 2003.
  12. D. Y. Meng, Y. Leung, and Z. B. Xu, “Evaluating nonlinear dimensionality reduction based on its local and global quality assessments,” Neurocomputing, vol. 74, no. 6, pp. 941–948, 2011.
  13. Z. Y. Zhang and H. Y. Zha, “Principal manifolds and nonlinear dimensionality reduction via tangent space alignment,” Journal of Shanghai University (English Edition), vol. 8, no. 4, pp. 406–424, 2004.
  14. D. Lunga, S. Prasad, M. M. Crawford, and O. Ersoy, “Manifold-learning-based feature extraction for classification of hyperspectral data: a review of advances in manifold learning,” IEEE Signal Processing Magazine, vol. 31, no. 1, pp. 55–66, 2014.
  15. J. C. Nascimento, J. G. Silva, J. S. Marques, and J. M. Lemos, “Manifold learning for object tracking with multiple nonlinear models,” IEEE Transactions on Image Processing, vol. 23, no. 4, pp. 1593–1605, 2014.
  16. V. de Silva and J. B. Tenenbaum, “Sparse multidimensional scaling using landmark points,” Tech. Rep., Stanford University, 2004.
  17. J. Venna and S. Kaski, “Local multidimensional scaling,” Neural Networks, vol. 19, no. 6–7, pp. 889–899, 2006.
  18. J. A. Lee and M. Verleysen, “Nonlinear dimensionality reduction of data manifolds with essential loops,” Neurocomputing, vol. 67, no. 1–4, pp. 29–53, 2005.
  19. D. Y. Meng, Y. Leung, T. Fung, and Z. B. Xu, “Nonlinear dimensionality reduction of data lying on the multicluster manifold,” IEEE Transactions on Systems, Man, and Cybernetics, Part B, vol. 38, no. 4, pp. 1111–1122, 2008.
  20. D. Meng, Y. Leung, and Z. Xu, “Passage method for nonlinear dimensionality reduction of data on multi-cluster manifolds,” Pattern Recognition, vol. 46, no. 8, pp. 2175–2186, 2013.
  21. L. Yang, “Building k-connected neighborhood graphs for isometric data embedding,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 5, pp. 827–831, 2006.
  22. I. T. Jolliffe, Principal Component Analysis, Springer, New York, NY, USA, 1989.
  23. T. F. Cox and M. A. A. Cox, Multidimensional Scaling, Chapman & Hall, 2nd edition, 2001.
  24. J. G. Silva, J. S. Marques, and J. M. Lemos, “Selecting landmark points for sparse manifold learning,” in Proceedings of the Neural Information Processing Systems (NIPS '05), 2005.
  25. D. L. Donoho and C. Grimes, “Hessian eigenmaps: locally linear embedding techniques for high-dimensional data,” Proceedings of the National Academy of Sciences of the United States of America, vol. 100, no. 10, pp. 5591–5596, 2003.
  26. M. Brückner, Large margin kernel machines for binary classification [Diplom thesis], 2005.
  27. A. Y. Alfakih, A. Khandani, and H. Wolkowicz, “Solving Euclidean distance matrix completion problems via semidefinite programming,” Computational Optimization and Applications, vol. 12, no. 1–3, pp. 13–30, 1999.
  28. M. Balasubramanian and E. L. Schwartz, “The isomap algorithm and topological stability,” Science, vol. 295, no. 5552, p. 7, 2002.
  29. J. Wang, Z. Zhang, and H. Zha, “Adaptive manifold learning,” in Advances in Neural Information Processing Systems (NIPS '04), vol. 17, pp. 1473–1480, MIT Press, 2005.