International Journal of Mathematics and Mathematical Sciences
Volume 2015, Article ID 258675, 6 pages
http://dx.doi.org/10.1155/2015/258675
Research Article

Comparative Study of Generalized Quantitative-Qualitative Inaccuracy Fuzzy Measures for Noiseless Coding Theorem and 1:1 Codes

1Department of Mathematics, Amity Institute of Applied Sciences, Amity University, Noida 201301, India
2Department of Mathematics, Ideal Institute of Management and Technology, GGSIP University, Delhi 110092, India

Received 3 March 2015; Accepted 1 April 2015

Academic Editor: Ram U. Verma

Copyright © 2015 H. D. Arora and Anjali Dhiman. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

In coding theory, we study various properties of codes for application in data compression, cryptography, error correction, and network coding. Codes are studied in Information Theory, electrical engineering, mathematics, and computer science for the reliable and efficient transmission of data. We must consider how messages can be encoded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time; thus, the minimum value of the mean codeword length subject to a given constraint on the codeword lengths has to be found. In this paper, we introduce a mean codeword length of order α and type β for 1:1 codes and analyze the relationship between average codeword length and fuzzy information measures for binary 1:1 codes. Further, a noiseless coding theorem associated with a fuzzy information measure is established.

1. Introduction

In the 1940s, a new branch of mathematics, Information Theory, was introduced. Information Theory considers the problems of how to process, store, and retrieve information, and hence of decision-making. It deals with the study of how information can be transmitted over communication channels. In 1924 and 1928, Nyquist [1, 2] first studied information measures, and Hartley [3] observed that information measures are logarithmic in nature. New properties of information sources and communication channels were published by Shannon [4] in his research paper "A Mathematical Theory of Communication." Wiener [5] independently obtained results similar to Shannon's. A large number of applications of Information Theory have been developed in the social, physical, and biological sciences, for example, in economics, statistics, accounting, language, psychology, ecology, pattern recognition, computer science, and fuzzy sets. Later, Rényi [6] introduced the entropy of order α, of which Shannon entropy is the limiting case as α → 1.

Fuzzy set theory was introduced by Zadeh [7]. It models the uncertainty occurring in human cognitive processes and has found applications in engineering, the biological sciences, and business. Zadeh also introduced fuzzy entropy, a measure of fuzzy information based on Shannon's entropy.

Let A be a fuzzy set in the universe of discourse X = {x_1, x_2, …, x_n}. Its membership function μ_A : X → [0, 1] assigns to each x ∈ X a degree of membership μ_A(x).
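As a concrete illustration (the set and membership values below are hypothetical, not from the paper), a finite fuzzy set can be represented simply by its vector of membership degrees:

```python
# A finite fuzzy set on X = {x1, ..., x4} can be represented by its
# vector of membership degrees, each in [0, 1] (values are hypothetical).
mu_A = [0.2, 0.5, 0.9, 1.0]

# Complement: mu_{A^c}(x) = 1 - mu_A(x).
mu_A_complement = [1.0 - m for m in mu_A]

# A crisp (ordinary) set has every membership degree equal to 0 or 1.
def is_crisp(mu):
    return all(m in (0.0, 1.0) for m in mu)

assert not is_crisp(mu_A)
assert is_crisp([0.0, 1.0, 1.0, 0.0])
```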

The following properties, which fuzzy entropy must satisfy, were given by De Luca and Termini [8]:
(1) Fuzzy entropy is minimum if and only if the set is crisp.
(2) Fuzzy entropy is maximum when every membership value is 0.5.
(3) Fuzzy entropy decreases if the set is sharpened.
(4) The fuzzy entropy of a set equals that of its complement.
In coding theory, we study various properties of codes for application in data compression, cryptography, error correction, and network coding. Codes are studied in Information Theory, electrical engineering, mathematics, and computer science for the reliable and efficient transmission of data. We must consider how messages can be encoded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time; thus, the minimum value of the mean codeword length subject to a given constraint on the codeword lengths has to be found. Let X be a random variable taking the values x_1, x_2, …, x_n, which generate the messages to be transmitted; each x_i is assigned a sequence of symbols from a code alphabet of size D, called its codeword. The codeword lengths n_1, n_2, …, n_n associated with the x_i satisfy Kraft's inequality,

Σ_{i=1}^{n} D^(−n_i) ≤ 1,

where D is the size of the code alphabet. Further, codes are chosen to minimize the average codeword length,

L = Σ_{i=1}^{n} p_i n_i,

where p_i is the probability of occurrence of x_i. Shannon's [4] noiseless coding theorem for uniquely decipherable codes gives the lower bound

L ≥ H(P) / log D,

in terms of Shannon's entropy H(P) = −Σ_{i=1}^{n} p_i log p_i.
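The quantities above can be checked numerically; the probabilities and codeword lengths in this sketch are hypothetical, chosen so that the code is optimal:

```python
import math

# Hypothetical 4-symbol source: probabilities p_i, codeword lengths n_i,
# over a D-ary code alphabet.
p = [0.5, 0.25, 0.125, 0.125]
n = [1, 2, 3, 3]
D = 2

# Kraft's inequality: sum_i D^(-n_i) <= 1 for a uniquely decipherable code.
kraft = sum(D ** (-ni) for ni in n)
assert kraft <= 1.0

# Average codeword length L = sum_i p_i * n_i.
L = sum(pi * ni for pi, ni in zip(p, n))

# Shannon entropy H(P) = -sum_i p_i log2 p_i.
H = -sum(pi * math.log2(pi) for pi in p)

# Noiseless coding theorem: L >= H(P) / log2(D); this code attains the bound.
assert L >= H / math.log2(D) - 1e-12
```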

Campbell [9] proposed a special exponentiated mean codeword length of order t for uniquely decipherable codes,

L(t) = (1/t) log_D [ Σ_{i=1}^{n} p_i D^(t n_i) ], t > 0,

and proved a noiseless coding theorem: its lower bound lies between H_α(P) and H_α(P) + 1, where α = 1/(1 + t), t > 0, and

H_α(P) = (1/(1 − α)) log_D [ Σ_{i=1}^{n} p_i^α ].

The above H_α(P) is Rényi's measure of entropy of order α. As t → 0 (α → 1), it is easily shown that L(t) approaches the average codeword length L and H_α(P) approaches Shannon's entropy H(P).
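The limiting behavior of the Rényi entropy can be verified numerically; the distribution below is hypothetical:

```python
import math

def shannon(p):
    # Shannon entropy H(P) = -sum p_i log2 p_i.
    return -sum(pi * math.log2(pi) for pi in p)

def renyi(p, alpha):
    # Renyi entropy of order alpha (alpha > 0, alpha != 1):
    # H_alpha(P) = (1 / (1 - alpha)) * log2(sum p_i^alpha).
    return math.log2(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

p = [0.5, 0.3, 0.2]  # hypothetical distribution
# As alpha -> 1, the Renyi entropy approaches the Shannon entropy.
assert abs(renyi(p, 1.001) - shannon(p)) < 1e-2
```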

For uniquely decipherable codes, a weighted average length was given by Guiasu and Picard [10]:

L_u = [ Σ_{i=1}^{n} u_i p_i n_i ] / [ Σ_{i=1}^{n} u_i p_i ].

This is the average cost of transmitting the letter x_i with probability p_i and utility u_i.

A quantitative-qualitative measure of entropy was defined by Belis and Guiasu [11],

H(P; U) = −Σ_{i=1}^{n} u_i p_i log p_i,

for which Longo [12] gave a lower bound on the useful mean codeword length. A noiseless coding theorem was then verified by Guiasu and Picard by introducing a lower bound for the useful mean codeword length, and Gurdial and Pessoa [13] proved a noiseless coding theorem by giving lower bounds for useful mean codeword lengths of order α in terms of useful measures of information of order α. A similar quantitative-qualitative information measure was given by Taneja and Tuteja, and Bhaker and Hooda gave further weighted information measures. Baig and Dar [14, 15] proposed a new fuzzy information measure of order α and type β, and for this information measure they introduced a corresponding mean code length of order α and type β. Choudhary and Kumar [16] proved some noiseless coding theorems on generalized R-norm entropy; they also [17] proposed some coding theorems on generalized Havrda-Charvát and Tsallis entropy. Baig and Dar [14, 15] introduced a few coding theorems on a fuzzy entropy function depending upon the parameters R and V and gave a fuzzy coding theorem on a generalized fuzzy cost measure. Taneja and Bhatia [18] proposed a generalized mean codeword length for the best 1:1 code, and Parkash and Sharma [19, 20] proved some noiseless coding theorems corresponding to fuzzy entropies and introduced a new class of fuzzy coding theorems. Parkash [21] introduced a new parametric measure of fuzzy entropy. Gupta et al. [22] proposed 1:1 codes for a generalized quantitative-qualitative measure of inaccuracy. Jain and Tuteja [23] introduced a coding theorem connected with useful entropy of order α and type β. Tuli [24] introduced mean codeword lengths and their correspondence with entropy measures.
Tuli and Sharma [25] proved some new coding theorems and consequently developed some new weighted fuzzy mean codeword lengths corresponding to the well-known measures of weighted fuzzy entropy.

In the next section, we prove the fuzzy noiseless coding theorem for binary 1:1 codes and hence show that such codes are less constrained.

2. Fuzzy Noiseless Coding Theorem for 1:1 Codes

Theorem 1. The mean codeword length of order α and type β for a binary 1:1 code is bounded below by the corresponding fuzzy information measure of order α and type β.

Proof. Consider the fuzzy information measure of order α and type β introduced above, together with the corresponding mean codeword length for 1:1 codes. By Hölder's inequality,

Σ_{i=1}^{n} x_i y_i ≥ [ Σ_{i=1}^{n} x_i^p ]^(1/p) [ Σ_{i=1}^{n} y_i^q ]^(1/q),

for all x_i, y_i > 0, i = 1, 2, …, n, where 1/p + 1/q = 1, with p < 1 (p ≠ 0) and q < 0, or q < 1 (q ≠ 0) and p < 0.
Choosing x_i and y_i suitably in terms of the membership values μ_A(x_i) and the codeword lengths n_i, (15) becomes an inequality between the length sum and the membership sum. Applying the Kraft-type condition on the n_i, the expression reduces; raising both sides to the appropriate power and then multiplying both sides by the normalizing factor yields the stated lower bound.
Hence, the mean codeword length is bounded below by the fuzzy information measure; this proves the result.
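The Hölder inequality in the form used above (with p < 1, so the inequality is reversed relative to the classical case) can be sanity-checked numerically; the vectors below are arbitrary:

```python
# Reverse Holder inequality: for x_i, y_i > 0, 0 < p < 1 and
# q = p / (p - 1) < 0 (so that 1/p + 1/q = 1),
#   sum(x_i * y_i) >= (sum(x_i^p))^(1/p) * (sum(y_i^q))^(1/q).
x = [1.0, 2.0, 3.0]
y = [0.5, 1.5, 2.5]
p = 0.5
q = p / (p - 1)  # q = -1.0, and 1/p + 1/q = 2 - 1 = 1

lhs = sum(xi * yi for xi, yi in zip(x, y))
rhs = (sum(xi ** p for xi in x) ** (1 / p)
       * sum(yi ** q for yi in y) ** (1 / q))
assert lhs >= rhs
```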

Theorem 2. The mean codeword length of order α and type β for a binary 1:1 code satisfies a corresponding upper bound.

Proof. We start from the defining inequality for the codeword lengths of the 1:1 code. Multiplying both sides by the appropriate factor and summing over i = 1, 2, …, n, and using the stated restrictions on α and β, suitable operations yield the required upper bound on the mean codeword length, which proves the result.

3. Conclusion

In modern data transmission and storage systems, error-correcting codes are a key ingredient in achieving a high degree of reliability. On a noiseless channel, the problem is the efficient coding of messages: we want to maximize the number of messages sent through the channel in a given time. Thus, we have introduced a mean codeword length of order α and type β for 1:1 codes and analyzed the relationship between average codeword length and fuzzy information measures for binary 1:1 codes. Further, noiseless coding theorems associated with a fuzzy information measure have been established.

4. Future Research Endeavors

Decision-making problems generally utilize information from the knowledge base of experts in view of their perception of the system. Besides mathematical studies, algorithms for application in decision-making can also be discussed as extensions of the above research. Other fuzzy information measures can be compared in the light of the fuzzy noiseless theorem for binary 1:1 codes. Further, this fuzzy noiseless coding theorem can be utilized for characterizing and applying R-norm entropy measures.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

  1. H. Nyquist, "Certain factors affecting telegraph speed," Bell System Technical Journal, pp. 324–325, 1924.
  2. H. Nyquist, "Certain topics in telegraph transmission theory," Transactions of the American Institute of Electrical Engineers, vol. 47, no. 2, pp. 617–644, 1928.
  3. R. V. L. Hartley, "Transmission of information," Bell System Technical Journal, vol. 7, no. 3, pp. 535–563, 1928.
  4. C. E. Shannon, "A mathematical theory of communication," The Bell System Technical Journal, vol. 27, pp. 379–423, 623–656, 1948.
  5. N. Wiener, Cybernetics, MIT Press, John Wiley & Sons, New York, NY, USA, 1948.
  6. A. Rényi, "On measures of entropy and information," in Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, pp. 547–561, University of California Press, 1961.
  7. L. A. Zadeh, "Fuzzy sets," Information and Control, vol. 8, pp. 338–353, 1965.
  8. A. De Luca and S. Termini, "A definition of a non-probabilistic entropy in the setting of fuzzy sets theory," Information and Control, vol. 20, no. 4, pp. 301–312, 1972.
  9. L. L. Campbell, "A coding theorem and Rényi's entropy," Information and Control, vol. 8, pp. 423–429, 1965.
  10. S. Guiasu and C.-F. Picard, "Borne inférieure de la longueur utile de certains codes," Comptes Rendus de l'Académie des Sciences, vol. 273, pp. 248–251, 1971.
  11. M. Belis and S. Guiasu, "A quantitative-qualitative measure of information in cybernetic systems," IEEE Transactions on Information Theory, vol. 14, no. 4, pp. 593–594, 1968.
  12. G. Longo, Quantitative-Qualitative Measures of Information, Springer, New York, NY, USA, 1972.
  13. Gurdial and F. Pessoa, "On useful information of order α," Journal of Combinatorics, Information & System Sciences, vol. 2, no. 4, pp. 158–162, 1977.
  14. M. A. K. Baig and M. J. Dar, "Some coding theorems on fuzzy entropy function depending upon parameters R and V," IOSR Journal of Mathematics, vol. 9, no. 6, pp. 119–123, 2014.
  15. M. A. K. Baig and M. J. Dar, "Fuzzy coding theorem on generalized fuzzy cost measure," Asian Journal of Fuzzy and Applied Mathematics, vol. 2, pp. 28–34, 2014.
  16. A. Choudhary and S. Kumar, "Some more noiseless coding theorem on generalized R-Norm entropy," Journal of Mathematics Research, vol. 3, no. 1, pp. 125–130, 2011.
  17. A. Choudhary and S. Kumar, "Some coding theorems on generalized Havrda-Charvát and Tsallis entropy," Tamkang Journal of Mathematics, vol. 43, no. 3, pp. 437–444, 2012.
  18. H. C. Taneja and P. K. Bhatia, "On a generalized mean codeword length for the best 1:1 code," Soochow Journal of Mathematics, vol. 16, pp. 1–6, 1990.
  19. O. Parkash and P. K. Sharma, "Noiseless coding theorems corresponding to fuzzy entropies," Southeast Asian Bulletin of Mathematics, vol. 27, no. 6, pp. 1073–1080, 2004.
  20. O. Parkash and P. K. Sharma, "A new class of fuzzy coding theorems," Caribbean Journal of Mathematical and Computing Sciences, vol. 12, pp. 1–10, 2002.
  21. O. Parkash, "A new parametric measure of fuzzy entropy," in Proceedings of the 7th International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU '98), pp. 1732–1737, Paris, France, July 1998.
  22. P. Gupta, Niranjan, and H. Arora, "On best 1:1 codes for generalized quantitative-qualitative measure of inaccuracy," African Journal of Mathematics and Computer Science Research, vol. 4, no. 3, pp. 159–163, 2011.
  23. P. Jain and R. K. Tuteja, "On a coding theorem connected with useful entropy of order α and type β," Kybernetika, vol. 23, no. 5, pp. 420–427, 1987.
  24. R. K. Tuli, "Mean codeword lengths and their correspondence with entropy measures," International Journal of Engineering and Natural Sciences, vol. 4, pp. 175–180, 2010.
  25. R. K. Tuli and C. S. Sharma, "Applications of measures of fuzzy entropy to coding theory," American Journal of Mathematics and Sciences, vol. 1, pp. 119–124, 2012.