Research Letters in Signal Processing
Volume 2008, Article ID 790607, 5 pages
http://dx.doi.org/10.1155/2008/790607
Research Letter

Generalized Cumulative Residual Entropy for Distributions with Unrestricted Supports

Lab-STICC (CNRS FRE 3167), Institut Telecom, Telecom Bretagne, Technopôle Brest Iroise, CS 83818, 29238 Brest Cedex, France

Received 6 April 2008; Accepted 19 June 2008

Academic Editor: Andreas Jakobsson

Copyright © 2008 Noomane Drissi et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
