Mathematical Problems in Engineering
Volume 2015, Article ID 759567, 21 pages
http://dx.doi.org/10.1155/2015/759567
Review Article

Intrinsic Dimension Estimation: Relevant Techniques and a Benchmark Framework

1Dipartimento di Informatica, Università degli Studi di Milano, Via Comelico 39, 20135 Milano, Italy
2Research Group, Hyera Software, Via Mattei 2, Coccaglio, 25030 Brescia, Italy

Received 25 February 2015; Accepted 17 May 2015

Academic Editor: Sangmin Lee

Copyright © 2015 P. Campadelli et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Linked References

  1. R. S. Bennett, “The intrinsic dimensionality of signal collections,” IEEE Transactions on Information Theory, vol. 15, no. 5, pp. 517–525, 1969. View at Google Scholar
  2. C. M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, Oxford, UK, 1995. View at MathSciNet
  3. E. Chávez, G. Navarro, R. Baeza-Yates, and J. L. Marroquín, “Searching in metric spaces,” ACM Computing Surveys, vol. 33, no. 3, pp. 273–321, 2001. View at Publisher · View at Google Scholar · View at Scopus
  4. V. Pestov, “An axiomatic approach to intrinsic dimension of a dataset,” Neural Networks, vol. 21, no. 2-3, pp. 204–213, 2008. View at Publisher · View at Google Scholar · View at Scopus
  5. V. Pestov, “Intrinsic dimensionality,” SIGSPATIAL Special, vol. 2, no. 2, pp. 8–11, 2010. View at Publisher · View at Google Scholar
  6. M. Katetov and P. Simon, “Origins of dimension theory,” in Handbook of the History of General Topology, vol. 1, 1997. View at Google Scholar · View at MathSciNet
  7. B. Kégl, “Intrinsic dimension estimation using packing numbers,” in Proceedings of the Neural Information Processing Systems (NIPS '02), S. Becker, S. Thrun, and K. Obermayer, Eds., pp. 681–688, MIT Press, 2002.
  8. Z. Zhang and H. Zha, “Adaptive manifold learning,” in Advances in Neural Information Processing Systems, vol. 17, 2005. View at Google Scholar
  9. M. Gashler and T. Martinez, “Tangent space guided intelligent neighbor finding,” in Proceedings of the International Joint Conference on Neural Network (IJCNN '11), pp. 2617–2624, August 2011. View at Publisher · View at Google Scholar · View at Scopus
  10. M. Gashler and T. Martinez, “Robust manifold learning with CycleCut,” Connection Science, vol. 24, no. 1, pp. 57–69, 2012. View at Publisher · View at Google Scholar · View at Scopus
  11. P. Zhang, H. Qiao, and B. Zhang, “An improved local tangent space alignment method for manifold learning,” Pattern Recognition Letters, vol. 32, no. 2, pp. 181–189, 2011. View at Publisher · View at Google Scholar · View at Scopus
  12. N. Verma, “Distance preserving embeddings for general n-dimensional manifolds,” Journal of Machine Learning Research, vol. 14, pp. 2415–2448, 2013. View at Google Scholar · View at MathSciNet
  13. R. E. Bellman, Adaptive Control Processes: A Guided Tour, Princeton University Press, Princeton, NJ, USA, 1961. View at MathSciNet
  14. M. Kirby, Geometric Data Analysis: An Empirical Approach to Dimensionality Reduction and the Study of Patterns, John Wiley & Sons, 2001. View at MathSciNet
  15. I. T. Jolliffe, Principal Component Analysis, Springer Series in Statistics, Springer, New York, NY, USA, 1986. View at Publisher · View at Google Scholar · View at MathSciNet
  16. V. N. Vapnik, Statistical Learning Theory, John Wiley & Sons, 1998. View at MathSciNet
  17. J. H. Friedman, T. Hastie, and R. Tibshirani, The Elements of Statistical Learning—Data Mining, Inference and Prediction, Springer, Berlin, Germany, 2009.
  18. P. Campadelli, E. Casiraghi, C. Ceruti, G. Lombardi, and A. Rozza, “Local intrinsic dimensionality based features for clustering,” in Image Analysis and Processing—ICIAP 2013, A. Petrosino, Ed., vol. 8156 of Lecture Notes in Computer Science, pp. 41–50, Springer, Berlin, Germany, 2013. View at Publisher · View at Google Scholar
  19. P. Grassberger and I. Procaccia, “Measuring the strangeness of strange attractors,” Physica D. Nonlinear Phenomena, vol. 9, no. 1-2, pp. 189–208, 1983. View at Publisher · View at Google Scholar · View at Zentralblatt MATH · View at MathSciNet
  20. H. Lähdesmäki, O. Yli-Harja, W. Zhang, and I. Shmulevich, “Intrinsic dimensionality in gene expression analysis,” in Proceedings of the International Workshop on Genomic Signal Processing and Statistics (GENSIPS '05), September 2005.
  21. F. Camastra and M. Filippone, “A comparative evaluation of nonlinear dynamics methods for time series prediction,” Neural Computing and Applications, vol. 18, no. 8, pp. 1021–1029, 2009. View at Publisher · View at Google Scholar · View at Scopus
  22. M. Valle and A. R. Oganov, “Crystal fingerprint space—a novel paradigm for studying crystal-structure sets,” Acta Crystallographica Section A, vol. 66, no. 5, pp. 507–517, 2010. View at Publisher · View at Google Scholar · View at Scopus
  23. K. M. Carter, R. Raich, and A. O. Hero, “On local intrinsic dimension estimation and its applications,” IEEE Transactions on Signal Processing, vol. 58, no. 2, pp. 650–663, 2010. View at Publisher · View at Google Scholar · View at MathSciNet · View at Scopus
  24. J. Lapuyade-Lahorgue and A. Mohammad-Djafari, “Nearest neighbors and correlation dimension for dimensionality estimation. Application to factor analysis of real biological time series data,” in Proceedings of The European Symposium on Artificial Neural Networks (ESANN ’11), pp. 363–368, Bruges, Belgium, April 2014. View at Scopus
  25. R. Heylen and P. Scheunders, “Hyperspectral intrinsic dimensionality estimation with nearest-neighbor distance ratios,” IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 6, no. 2, pp. 570–579, 2013. View at Publisher · View at Google Scholar
  26. K. W. Pettis, T. A. Bailey, A. K. Jain, and R. C. Dubes, “An intrinsic dimensionality estimator from near-neighbor information,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 1, no. 1, pp. 25–37, 1979. View at Google Scholar · View at Scopus
  27. E. Levina and P. J. Bickel, “Maximum likelihood estimation of intrinsic dimension,” in Proceedings of the NIPS, vol. 1, pp. 777–784, 2004.
  28. F. Camastra and A. Vinciarelli, “Estimating the intrinsic dimension of data with a fractal-based method,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 10, pp. 1404–1407, 2002. View at Publisher · View at Google Scholar · View at Scopus
  29. J. A. Costa and A. O. Hero, “Geodesic entropic graphs for dimension and entropy estimation in manifold learning,” IEEE Transactions on Signal Processing, vol. 52, no. 8, pp. 2210–2221, 2004. View at Publisher · View at Google Scholar · View at MathSciNet · View at Scopus
  30. J. A. Costa and A. O. Hero, “Learning intrinsic dimension and entropy of high-dimensional shape spaces,” in Proceedings of the European Signal Processing Conference (EUSIPCO '04), pp. 231–252, September 2004.
  31. K. S. Beyer, J. Goldstein, R. Ramakrishnan, and U. Shaft, “When is ‘nearest neighbor’ meaningful?” in Proceedings of the 7th International Conference on Database Theory (ICDT '99), pp. 217–235, Springer, London, UK, 1999.
  32. K. M. Carter, R. Raich, W. G. Finn, and A. O. Hero III, “FINE: fisher information nonparametric embedding,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 11, pp. 2093–2098, 2009. View at Publisher · View at Google Scholar · View at Scopus
  33. A. M. Farahmand, C. Szepesvári, and J.-Y. Audibert, “Manifold-adaptive dimension estimation,” in Proceedings of the 24th international conference on Machine learning (ICML ’07), pp. 265–272, June 2007. View at Publisher · View at Google Scholar · View at Scopus
  34. J. A. Scheinkman and B. LeBaron, “Nonlinear dynamics and stock returns,” The Journal of Business, vol. 62, no. 3, pp. 311–337, 1989. View at Publisher · View at Google Scholar
  35. D. R. Chialvo, R. F. Gilmour Jr., and J. Jalife, “Low dimensional chaos in cardiac tissue,” Nature, vol. 343, no. 6259, pp. 653–657, 1990. View at Publisher · View at Google Scholar · View at Scopus
  36. A. Mekler, “Calculation of eeg correlation dimension: large massifs of experimental data,” Computer Methods and Programs in Biomedicine, vol. 92, no. 1, pp. 154–160, 2008. View at Publisher · View at Google Scholar · View at Scopus
  37. G. N. Derry and P. S. Derry, “Age dependence of the menstrual cycle correlation dimension,” Open Journal of Biophysics, vol. 2, no. 2, pp. 40–45, 2012. View at Publisher · View at Google Scholar
  38. V. Isham, Statistical Aspects of Chaos: A Review, Chapman and Hall, London, UK, 1993.
  39. S. Haykin and X. B. Li, “Detection of signals in chaos,” Proceedings of the IEEE, vol. 83, no. 1, pp. 95–122, 1995. View at Publisher · View at Google Scholar · View at Scopus
  40. P. Somervuo, “Speech dimensionality analysis on hypercubical self-organizing maps,” Neural Processing Letters, vol. 17, no. 2, pp. 125–136, 2003. View at Publisher · View at Google Scholar · View at Scopus
  41. B. Hu, T. Rakthanmanon, Y. Hao, S. Evans, S. Lonardi, and E. Keogh, “Towards discovering the intrinsic cardinality and dimensionality of time series using MDL,” in Algorithmic Probability and Friends. Bayesian Prediction and Artificial Intelligence, vol. 7070 of Lecture Notes in Computer Science, pp. 184–197, Springer, Berlin, Germany, 2013. View at Publisher · View at Google Scholar
  42. D. C. Laughlin, “The intrinsic dimensionality of plant traits and its relevance to community assembly,” Journal of Ecology, vol. 102, no. 1, pp. 186–193, 2014. View at Publisher · View at Google Scholar · View at Scopus
  43. F. Camastra, “Data dimensionality estimation methods: a survey,” Pattern Recognition, vol. 36, no. 12, pp. 2945–2954, 2003. View at Publisher · View at Google Scholar · View at Scopus
  44. A. K. Romney, R. N. Shepard, and S. B. Nerlove, Multidimensionaling Scaling, Volume I: Theory, Seminar Press, 1972.
  45. A. K. Romney, R. N. Shepard, and S. B. Nerlove, Multidimensionaling Scaling, Volume II: Applications, Seminar Press, 1972.
  46. T. Lin and H. Zha, “Riemannian manifold learning,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, no. 5, pp. 796–809, 2008. View at Publisher · View at Google Scholar · View at Scopus
  47. R. N. Shepard, “The analysis of proximities: multidimensional scaling with an unknown distance function. Part I,” Psychometrika, vol. 27, pp. 125–140, 1962. View at Google Scholar · View at MathSciNet
  48. R. N. Shepard, “The analysis of proximities: multidimensional scaling with an unknown distance function, part II,” Psychometrika, vol. 27, pp. 219–246, 1962. View at Google Scholar · View at MathSciNet
  49. J. B. Kruskal, “Multidimensional scaling by optimizing goodness of fit to a nonmetric hypothesis,” Psychometrika, vol. 29, pp. 1–27, 1964. View at Google Scholar · View at MathSciNet
  50. J. B. Kruskal and J. D. Carrol, Geometrical Models and Badness-of-Fit Functions, vol. 2, Academic Press, 1969.
  51. R. N. Shepard and J. D. Carroll, Parametric Representation of Nonlinear Data Structures, Academic Press, New York, NY, USA, 1969.
  52. J. B. Kruskal, Linear Transformation of Multivariate Data to Reveal Clustering, vol. 1, Academic Press, New York, NY, USA, 1972.
  53. C. K. Chen and H. C. Andrews, “Nonlinear intrinsic dimensionality computations,” IEEE Transactions on Computers, vol. C-23, no. 2, pp. 178–184, 1974. View at Publisher · View at Google Scholar
  54. J. W. J. Sammon, “A nonlinear mapping for data structure analysis,” IEEE Transactions on Computers, vol. 18, pp. 401–409, 1969. View at Google Scholar
  55. P. Demartines and J. Hérault, “Curvilinear component analysis: A self-organizing neural network for nonlinear mapping of data sets,” IEEE Transactions on Neural Networks, vol. 8, no. 1, pp. 148–154, 1997. View at Publisher · View at Google Scholar · View at Scopus
  56. J. B. Tenenbaum, V. de Silva, and J. C. Langford, “A global geometric framework for nonlinear dimensionality reduction,” Science, vol. 290, no. 5500, pp. 2319–2323, 2000. View at Publisher · View at Google Scholar · View at Scopus
  57. S. T. Roweis and L. K. Saul, “Nonlinear dimensionality reduction by locally linear embedding,” Science, vol. 290, no. 5500, pp. 2323–2326, 2000. View at Publisher · View at Google Scholar · View at Scopus
  58. J. A. Lee and M. Verleysen, Nonlinear Dimensionality Reduction, Springer, New York, NY, USA, 2007.
  59. R. Karbauskaite, G. Dzemyda, and E. Mazetis, “Geodesic distances in the maximum likelihood estimator of intrinsic dimensionality,” Nonlinear Analysis: Modelling and Control, vol. 16, no. 4, pp. 387–402, 2011. View at Google Scholar · View at MathSciNet
  60. M. Polito and P. Perona, “Grouping and dimensionality reduction by locally linear embedding,” Advances in Neural Information Processing Systems, vol. 14, pp. 1255–1262, 2001. View at Google Scholar
  61. B. Schölkopf, A. Smola, and K.-R. Müller, “Nonlinear component analysis as a kernel eigenvalue problem,” Neural Computation, vol. 10, no. 5, pp. 1299–1319, 1998. View at Publisher · View at Google Scholar · View at Scopus
  62. K. Fukunaga and D. R. Olsen, “An algorithm for finding intrinsic dimensionality of data,” IEEE Transactions on Computers, vol. 20, no. 2, pp. 176–183, 1971. View at Publisher · View at Google Scholar · View at Scopus
  63. P. J. Verveer and R. P. W. Duin, “An evaluation of intrinsic dimensionality estimators,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 17, no. 1, pp. 81–86, 1995. View at Publisher · View at Google Scholar · View at Scopus
  64. J. Brüske and G. Sommer, “Intrinsic dimensionality estimation with optimally topology preserving maps,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 5, pp. 572–575, 1998. View at Publisher · View at Google Scholar · View at Scopus
  65. T. Martinetz and K. Schulten, “Topology representing networks,” Neural Networks, vol. 7, no. 3, pp. 507–522, 1994. View at Publisher · View at Google Scholar · View at Scopus
  66. M. E. Tipping and C. M. Bishop, “Probabilistic principal component analysis,” Journal of the Royal Statistical Society. Series B. Statistical Methodology, vol. 61, no. 3, pp. 611–622, 1999. View at Publisher · View at Google Scholar · View at Zentralblatt MATH · View at MathSciNet
  67. R. Everson and S. Roberts, “Inferring the eigenvalues of covariance matrices from limited, noisy data,” IEEE Transactions on Signal Processing, vol. 48, no. 7, pp. 2083–2091, 2000. View at Publisher · View at Google Scholar · View at MathSciNet · View at Scopus
  68. C. M. Bishop, “Bayesian PCA,” in Proceedings of the 12th Annual Conference on Neural Information Processing Systems (NIPS ’98), pp. 382–388, December 1998. View at Scopus
  69. J. J. Rajan and P. J. W. Rayner, “Model order selection for the singular value decomposition and the discrete Karhunen-Loeve transform using a Bayesian approach,” IEE Proceedings—Vision, Image and Signal Processing, vol. 144, no. 2, pp. 116–123, 1997. View at Google Scholar
  70. T. P. Minka, “Automatic choice of dimensionality for PCA,” Tech. Rep. 514, MIT, 2000. View at Google Scholar
  71. C. Bouveyron, G. Celeux, and S. Girard, “Intrinsic dimension estimation by maximum likelihood in isotropic probabilistic PCA,” Pattern Recognition Letters, vol. 32, no. 14, pp. 1706–1713, 2011. View at Publisher · View at Google Scholar · View at Scopus
  72. J. Li and D. Tao, “Simple exponential family PCA,” in Proceedings of the 13th International Conference on Artificial Intelligence and Statistics (AISTATS '10), pp. 453–460, Sardinia, Italy, May 2010.
  73. Y. Guan and J. G. Dy, “Sparse probabilistic principal component analysis,” Journal of Machine Learning Research, vol. 5, pp. 185–192, 2009. View at Google Scholar · View at Scopus
  74. H. Zou, T. Hastie, and R. Tibshirani, “Sparse principal component analysis,” Journal of Computational and Graphical Statistics, vol. 15, no. 2, pp. 265–286, 2006. View at Publisher · View at Google Scholar · View at MathSciNet · View at Scopus
  75. C. M. Bishop, Pattern Recognition and Machine Learning, Springer, New York, NY, USA, 2nd edition, 2006. View at Publisher · View at Google Scholar · View at MathSciNet
  76. C. Ceruti, S. Bassis, A. Rozza, G. Lombardi, E. Casiraghi, and P. Campadelli, “DANCo: an intrinsic dimensionality estimator exploiting angle and norm concentration,” Pattern Recognition, vol. 47, no. 8, pp. 2569–2581, 2014. View at Publisher · View at Google Scholar · View at Scopus
  77. A. V. Little, M. Maggioni, and L. Rosasco, “Multiscale geometric methods for data sets I: multiscale SVD, noise and curvature,” MIT-CSAIL-TR 2012-029, 2012. View at Google Scholar
  78. F. G. Kaslovsky and D. N. Meyer, “Optimal tangent plane recovery from noisy manifold samples,” http://xxx.tau.ac.il/abs/1111.4601v2.
  79. M. Hein and J. Y. Audibert, “Intrinsic dimensionality estimation of submanifolds in euclidean space,” in Proceedings of the International Conference on Machine Learning (ICML '05), pp. 289–296, 2005.
  80. G. Haro, G. Randall, and G. Sapiro, “Translated poisson mixture model for stratification learning,” International Journal of Computer Vision, vol. 80, no. 3, pp. 358–374, 2008. View at Publisher · View at Google Scholar · View at Scopus
  81. M. Chen, J. Silva, J. Paisley, C. Wang, D. Dunson, and L. Carin, “Compressive sensing on manifolds using a nonparametric mixture of factor analyzers: algorithm and performance bounds,” IEEE Transactions on Signal Processing, vol. 58, no. 12, pp. 6140–6155, 2010. View at Publisher · View at Google Scholar · View at MathSciNet · View at Scopus
  82. L. Brouwer, Collected Works, Volume I, Philosophy and Foundations of Mathematics and II, Geometry, Analysis, Topology and Mechanics, North-Holland/American Elsevier, 1976.
  83. I. M. James, History of Topology, Mathematics, Elsevier, 1999.
  84. G. Medioni and P. Mordohai, “The tensor voting framework,” in Emerging Topics in Computer Vision, pp. 191–255, Prentice Hall, 2004. View at Google Scholar
  85. G. Lombardi, E. Casiraghi, and P. Campadelli, “Curvature estimation and curve inference with tensor voting: a new approach,” in Proceedings of the 10th International Conference on Advanced Concepts for Intelligent Vision Systems (ACIVS '08), vol. 5259, pp. 613–624, 2008.
  86. M. Wertheimer, “Untersuchungen zur Lehre von der Gestalt II,” in Psycologische Forshung, vol. 4, pp. 301–350, 1923, Translation: A Source Book of Gestalt Psychology. View at Google Scholar
  87. P. Mordohai and G. Medioni, “Dimensionality estimation, manifold learning and function approximation using tensor voting,” Journal of Machine Learning Research, vol. 11, pp. 411–450, 2010. View at Google Scholar · View at MathSciNet
  88. J. C. Robinson, Dimensions, Embeddings, and Attractors, Cambridge Tracts in Mathematics, Cambridge University Press, 2010.
  89. C.-G. Li, J. Guo, and B. Xiao, “Intrinsic dimensionality estimation within neighborhood convex hull,” International Journal of Pattern Recognition and Artificial Intelligence, vol. 23, no. 1, pp. 31–44, 2009. View at Publisher · View at Google Scholar · View at Scopus
  90. K. Falconer, Fractal Geometry—Mathematical Foundations and Applications, John Wiley & Sons, 2nd edition, 2003. View at Publisher · View at Google Scholar · View at MathSciNet
  91. N. Tatti, T. Mielikäinen, A. Gionis, and H. Mannila, “What is the dimension of your binary data?” in Proceedings of the 6th International Conference on Data Mining (ICDM' 06), pp. 603–612, December 2006. View at Publisher · View at Google Scholar · View at Scopus
  92. J.-P. Eckmann and D. Ruelle, “Fundamental limitations for estimating dimensions and Lyapunov exponents in dynamical systems,” Physica D. Nonlinear Phenomena, vol. 56, no. 2-3, pp. 185–187, 1992. View at Publisher · View at Google Scholar · View at MathSciNet
  93. F. Takens, “On the numerical determination of the dimension of an attractor,” in Dynamical Systems and Bifurcations, B. J. Braaksma, H. W. Broer, and F. Takens, Eds., vol. 1125 of Lecture Notes in Mathematics, pp. 99–106, Springer, Berlin, Germany, 1985. View at Publisher · View at Google Scholar · View at MathSciNet
  94. Y. Ashkenazy, “The use of generalized information dimension in measuring fractal dimension of time series,” Physica A: Statistical Mechanics and Its Applications, vol. 271, no. 3-4, pp. 427–447, 1999. View at Publisher · View at Google Scholar · View at Scopus
  95. C. Tricot Jr., “Two definitions of fractional dimension,” Mathematical Proceedings of the Cambridge Philosophical Society, vol. 91, no. 1, pp. 57–74, 1982. View at Publisher · View at Google Scholar · View at MathSciNet
  96. M. R. Brito, A. J. Quiroz, and J. E. Yukich, “Intrinsic dimension identification via graph-theoretic methods,” Journal of Multivariate Analysis, vol. 116, pp. 263–277, 2013. View at Publisher · View at Google Scholar · View at MathSciNet · View at Scopus
  97. M. Raginsky and S. Lazebnik, “Estimation of intrinsic dimensionality using high-rate vector quantization,” in Proceedings of the NIPS, pp. 1105–1112, 2005.
  98. P. L. Zador, “Asymptotic quantization error of continuous signals and the quantization dimension,” IEEE Transactions on Information Theory, vol. 28, no. 2, pp. 139–149, 1982. View at Publisher · View at Google Scholar · View at MathSciNet
  99. K. Kumaraswamy, V. Megalooikonomou, and C. Faloutsos, “Fractal dimension and vector quantization,” Information Processing Letters, vol. 91, no. 3, pp. 107–113, 2004. View at Publisher · View at Google Scholar · View at Zentralblatt MATH · View at MathSciNet · View at Scopus
  100. G. V. Trunk, “Statistical estimation of the intrinsic dimensionality of a noisy signal collection,” IEEE Transactions on Computers, vol. 25, no. 2, pp. 165–171, 1976. View at Google Scholar · View at MathSciNet
  101. M. Fan, H. Qiao, and B. Zhang, “Intrinsic dimension estimation of manifolds by incising balls,” Pattern Recognition, vol. 42, no. 5, pp. 780–787, 2009. View at Publisher · View at Google Scholar · View at Scopus
  102. D. MacKay and Z. Ghahramani, “Comments on maximum likelihood estimation of intrinsic dimension by E. Levina and P. Bickel,” 2005, http://www.inference.phy.cam.ac.uk/mackay/dimension/.
  103. M. D. Penrose and J. E. Yukich, “Limit theory for point processes in manifolds,” The Annals of Applied Probability, vol. 23, no. 6, pp. 2161–2211, 2013. View at Publisher · View at Google Scholar · View at MathSciNet · View at Scopus
  104. P. J. Bickel and D. Yan, “Sparsity and the possibility of inference,” Sankhya: The Indian Journal of Statistics, vol. 70, no. 1, 23 pages, 2008. View at Google Scholar
  105. M. Das Gupta and T. S. Huang, “Regularized maximum likelihood for intrinsic dimension estimation,” in Proceedings of the 26th Conference on Uncertainty in Artificial Intelligence (UAI '10), P. Grünwald and P. Spirtes, Eds., pp. 220–227, AUAI Press, Catalina Island, Calif, USA, July 2010.
  106. R. Karbauskaite and G. Dzemyda, “Investigation of the maximum likelihood estimator of intrinsic dimensionality,” in Proceedings of the 10th International Conference on Computer Data Analysis and Modeling, vol. 2, pp. 110–113, 2013.
  107. A. Rozza, G. Lombardi, C. Ceruti, E. Casiraghi, and P. Campadelli, “Novel high intrinsic dimensionality estimators,” Machine Learning, vol. 89, no. 1-2, pp. 37–65, 2012. View at Publisher · View at Google Scholar · View at Zentralblatt MATH · View at MathSciNet · View at Scopus
  108. Q. Wang, S. R. Kulkarni, and S. Verdú, “A nearest-neighbor approach to estimating divergence between continuous random vectors,” in Proceedings of the IEEE International Symposium on Information Theory (ISIT '06), pp. 242–246, July 2006. View at Publisher · View at Google Scholar · View at Scopus
  109. K. V. Mardia, Statistics of Directional Data, Academic Press, 1972. View at MathSciNet
  110. A. J. Quiroz, “Graph-theoretical methods,” in Encyclopedia of Statistical Sciences, vol. 5, Wiley and Sons, New York, NY, USA, 2006. View at Google Scholar
  111. A. O. Hero III, B. Ma, O. J. J. Michel, and J. Gorman, “Applications of entropic spanning graphs,” IEEE Signal Processing Magazine, vol. 19, no. 5, pp. 85–95, 2002. View at Publisher · View at Google Scholar · View at Scopus
  112. J. A. Costa, A. Girotra, and A. O. Hero III, “Estimating local intrinsic dimension with k-nearest neighbor graphs,” in Proceedings of the IEEE/SP 13th Workshop on Statistical Signal Processing, pp. 417–421, July 2005. View at Scopus
  113. J. H. Friedman and L. C. Rafsky, “Graph-theoretic measures of multivariate association and prediction,” Annals of Statistics, vol. 11, no. 2, pp. 377–391, 1983. View at Publisher · View at Google Scholar · View at MathSciNet
  114. M. D. Penrose and J. E. Yukich, “Central limit theorems for some graphs in computational geometry,” The Annals of Applied Probability, vol. 11, no. 4, pp. 1005–1041, 2001. View at Publisher · View at Google Scholar · View at MathSciNet
  115. M. R. Brito, A. J. Quiroz, and J. E. Yukich, “Graph-theoretic procedures for dimension identification,” Journal of Multivariate Analysis, vol. 81, no. 1, pp. 67–84, 2002. View at Publisher · View at Google Scholar · View at MathSciNet
  116. J. M. Steele, L. A. Shepp, and W. F. Eddy, “On the number of leaves of a Euclidean minimal spanning tree,” Journal of Applied Probability, vol. 24, no. 4, pp. 809–826, 1987. View at Publisher · View at Google Scholar · View at MathSciNet
  117. M. F. Schilling, “Mutual and shared neighbor probabilities: finite- and infinite-dimensional results,” Advances in Applied Probability, vol. 18, no. 2, pp. 388–405, 1986. View at Publisher · View at Google Scholar · View at MathSciNet
  118. K. Sricharan, R. Raich, and A. O. Hero III, “Optimized intrinsic dimension estimator using nearest neighbor graphs,” in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP ’10), pp. 5418–5421, Dallas, Tex, USA, March 2010. View at Publisher · View at Google Scholar · View at Scopus
  119. Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner, “Gradient-based learning applied to document recognition,” Proceedings of the IEEE, vol. 86, no. 11, pp. 2278–2324, 1998. View at Publisher · View at Google Scholar · View at Scopus
  120. A. Frank and A. Asuncion, UCI Machine Learning Repository, UCI, 2010.
  121. F. Pineda and J. Sommerer, “Estimating generalized dimensions and choosing time delays: a fast algorithm,” in Time Series Prediction: Forecasting the Future and Understanding the Past, pp. 367–385, 1994. View at Google Scholar
  122. J. A. Costa and A. O. Hero, Determining Intrinsic Dimension and Entropy of High-Dimensional Shape Spaces, Birkhäuser, Boston, Mass, USA, 2006.
  123. I. Kivimäki, K. Lagus, I. Nieminen, J. Väyrynen, and T. Honkela, “Using correlation dimension for analysing text data,” in Artificial Neural Networks—ICANN 2010: Proceedings of the 20th International Conference, Thessaloniki, Greece, September 15–18, 2010, Part I, vol. 6352 of Lecture Notes in Computer Science, pp. 368–373, Springer, Berlin, Germany, 2010. View at Publisher · View at Google Scholar
  124. E. Ott, Chaos in Dynamical Systems, Cambridge University Press, Cambridge, UK, 1993. View at Publisher · View at Google Scholar · View at MathSciNet
  125. L. O. Chua, M. Komuro, and T. Matsumoto, “The double scroll,” IEEE Transactions on Circuits and Systems, vol. 32, no. 8, pp. 797–818, 1985. View at Publisher · View at Google Scholar · View at MathSciNet
  126. G. Lombardi, A. Rozza, C. Ceruti, E. Casiraghi, and P. Campadelli, “Minimum neighbor distance estimators of intrinsic dimension,” in Machine Learning and Knowledge Discovery in Databases: Proceedings of the European Conference, ECML PKDD 2011, Athens, Greece, September 5–9, 2011, Part II, vol. 6912 of Lecture Notes in Computer Science, pp. 374–389, Springer, Berlin, Germany, 2011. View at Publisher · View at Google Scholar
  127. J. Jaccard, M. A. Becker, and G. Wood, “Pairwise multiple comparison procedures: a review,” Psychological Bulletin, vol. 96, no. 3, pp. 589–596, 1984. View at Publisher · View at Google Scholar · View at Scopus
  128. D. Gong, X. Zhao, and G. Medioni, “Robust multiple manifolds structure learning,” in Proceedings of the 29th International Conference on Machine Learning (ICML' 12), pp. 321–328, July 2012. View at Scopus
  129. J. Wei, H. Peng, Y.-S. Lin, Z.-M. Huang, and J.-B. Wang, “Adaptive neighborhood selection for manifold learning,” in Proceedings of the International Conference on Machine Learning and Cybernetics (ICMLC '08), vol. 1, pp. 380–384, IEEE, Kunming, China, July 2008. View at Publisher · View at Google Scholar · View at Scopus