Mathematical Problems in Engineering
Volume 2015, Article ID 201874, 12 pages
http://dx.doi.org/10.1155/2015/201874
Research Article

A Unified Definition of Mutual Information with Applications in Machine Learning

Guoping Zeng
Elevate, 4150 International Plaza, Fort Worth, TX 76109, USA

Received 21 December 2014; Revised 16 March 2015; Accepted 17 March 2015

Academic Editor: Zexuan Zhu

Copyright © 2015 Guoping Zeng. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Linked References

  1. G. D. Tourassi, E. D. Frederick, M. K. Markey, and C. E. Floyd Jr., “Application of the mutual information criterion for feature selection in computer-aided diagnosis,” Medical Physics, vol. 28, no. 12, pp. 2394–2402, 2001.
  2. I. Guyon and A. Elisseeff, “An introduction to variable and feature selection,” Journal of Machine Learning Research, vol. 3, pp. 1157–1182, 2003.
  3. A. Navot, On the role of feature selection in machine learning [Ph.D. thesis], Hebrew University, 2006.
  4. C. E. Shannon, “A mathematical theory of communication,” The Bell System Technical Journal, vol. 27, no. 3, pp. 379–423, 1948.
  5. S. Kullback, Information Theory and Statistics, John Wiley & Sons, New York, NY, USA, 1959.
  6. M. S. Pinsker, Information and Information Stability of Random Variables and Processes, Academy of Sciences of the USSR, 1960; English translation by A. Feinstein, Holden-Day, San Francisco, Calif, USA, 1964.
  7. R. B. Ash, Information Theory, Interscience Publishers, New York, NY, USA, 1965.
  8. T. M. Cover and J. A. Thomas, Elements of Information Theory, John Wiley & Sons, New York, NY, USA, 2nd edition, 2006.
  9. R. M. Fano, Transmission of Information, MIT Press, Cambridge, Mass, USA; John Wiley & Sons, New York, NY, USA, 1961.
  10. N. Abramson, Information Theory and Coding, McGraw-Hill, New York, NY, USA, 1963.
  11. R. G. Gallager, Information Theory and Reliable Communication, John Wiley & Sons, New York, NY, USA, 1968.
  12. R. B. Ash and C. A. Doleans-Dade, Probability & Measure Theory, Academic Press, San Diego, Calif, USA, 2nd edition, 2000.
  13. I. Braga, “A constructive density-ratio approach to mutual information estimation: experiments in feature selection,” Journal of Information and Data Management, vol. 5, no. 1, pp. 134–143, 2014.
  14. L. Paninski, “Estimation of entropy and mutual information,” Neural Computation, vol. 15, no. 6, pp. 1191–1253, 2003.
  15. M. Refaat, Credit Risk Scorecards: Development and Implementation Using SAS, Lulu.com, New York, NY, USA, 2011.
  16. N. Siddiqi, Credit Risk Scorecards: Developing and Implementing Intelligent Credit Scoring, John Wiley & Sons, New York, NY, USA, 2006.
  17. G. Zeng, “Metric divergence measures and information value in credit scoring,” Journal of Mathematics, vol. 2013, Article ID 848271, 10 pages, 2013.
  18. G. Zeng, “A rule of thumb for reject inference in credit scoring,” Mathematical Finance Letters, vol. 2014, article 2, 2014.
  19. G. Zeng, “A necessary condition for a good binning algorithm in credit scoring,” Applied Mathematical Sciences, vol. 8, no. 65, pp. 3229–3242, 2014.
  20. K. Kennedy, Credit scoring using machine learning [Ph.D. thesis], School of Computing, Dublin Institute of Technology, Dublin, Ireland, 2013.
  21. R. J. McEliece, The Theory of Information and Coding, Cambridge University Press, Cambridge, UK, student edition, 2004.
  22. J. H. Friedman, “Greedy function approximation: a gradient boosting machine,” The Annals of Statistics, vol. 29, no. 5, pp. 1189–1232, 2001.