Abstract

In coding theory, we study various properties of codes for applications in data compression, cryptography, error correction, and network coding. Codes are studied in Information Theory, electrical engineering, mathematics, and computer science for the reliable and efficient transmission of data. We must consider how messages can be encoded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time. Thus, the minimum value of the mean codeword length subject to a given constraint on the codeword lengths has to be found. In this paper, we introduce a mean codeword length of order $\alpha$ and type $\beta$ for 1:1 codes and analyze the relationship between the average codeword length and fuzzy information measures for binary 1:1 codes. Further, a noiseless coding theorem associated with a fuzzy information measure is established.

1. Introduction

In the 1940s, a new branch of mathematics known as Information Theory was introduced. Information Theory considers the problems of how to process, store, and retrieve information, and hence of decision-making. It deals with how information can be transmitted over communication channels. In 1924 and 1928, Nyquist [1, 2] first studied information measures, and Hartley [3] showed that information measures are logarithmic in nature. New properties of information sources and communication channels were published by Shannon [4] in his paper "A Mathematical Theory of Communication." Wiener [5] independently obtained results similar to Shannon's. A large number of applications of Information Theory have been developed in various fields of the social, physical, and biological sciences, for example, in economics, statistics, accounting, language, psychology, ecology, pattern recognition, computer science, and fuzzy sets. Later, Rényi [6] introduced the entropy of order $\alpha$, of which Shannon entropy is the limiting case as $\alpha \to 1$.

Fuzzy set theory was introduced by Zadeh [7]. It relates to the uncertainty occurring in human cognitive processes and has found applications in engineering, the biological sciences, and business. Zadeh also introduced fuzzy entropy, a measure of fuzzy information based on Shannon's entropy.

Let $A$ be a fuzzy set in the universe of discourse $X = \{x_1, x_2, \ldots, x_n\}$. We define the membership function $\mu_A : X \to [0, 1]$, where $\mu_A(x_i)$ is the degree of membership of each $x_i \in X$ in $A$.
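For concreteness, the fuzzy entropy mentioned above is commonly taken in the De Luca and Termini form $H(A) = -\sum_{i=1}^{n} \left[ \mu_A(x_i)\log_2 \mu_A(x_i) + (1-\mu_A(x_i))\log_2 (1-\mu_A(x_i)) \right]$. The following sketch (function and variable names are ours, for illustration only) computes it:

```python
import math

def fuzzy_entropy(memberships):
    """De Luca-Termini fuzzy entropy (base 2):
    H(A) = -sum(mu*log2(mu) + (1-mu)*log2(1-mu))."""
    h = 0.0
    for mu in memberships:
        for p in (mu, 1.0 - mu):
            if 0.0 < p < 1.0:
                h -= p * math.log2(p)
            # terms with p in {0, 1} contribute 0 by convention
    return h

crisp = [0.0, 1.0, 1.0, 0.0]      # a crisp set: entropy is minimum (0)
fuzziest = [0.5, 0.5, 0.5, 0.5]   # all memberships 0.5: entropy is maximum
print(fuzzy_entropy(crisp))       # 0.0
print(fuzzy_entropy(fuzziest))    # 4.0
```

The example previews the properties listed below: the crisp set has zero entropy, the set with all memberships equal to 0.5 has maximal entropy, and a set and its complement share the same entropy.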

The following properties, which fuzzy entropy must satisfy, were given by De Luca and Termini [8]:
(1) Fuzzy entropy is minimum if and only if the set is crisp.
(2) Fuzzy entropy is maximum when every membership value is 0.5.
(3) Fuzzy entropy decreases if the set is sharpened.
(4) The fuzzy entropy of a set is the same as that of its complement.

In coding theory, we study various properties of codes for applications in data compression, cryptography, error correction, and network coding. Codes are studied in Information Theory, electrical engineering, mathematics, and computer science for the reliable and efficient transmission of data. We must consider how messages can be encoded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time. Thus, the minimum value of the mean codeword length subject to a given constraint on the codeword lengths has to be found. Let $X$ be a random variable taking values $x_1, x_2, \ldots, x_n$ that generate the messages to be transmitted. Each $x_i$ is assigned a sequence over a code alphabet, and the sequence assigned to $x_i$ is called its codeword. The codeword lengths $l_1, l_2, \ldots, l_n$ of a uniquely decipherable code satisfy Kraft's inequality,

$$\sum_{i=1}^{n} D^{-l_i} \le 1,$$

where $D$ is the size of the code alphabet. Further, codes are chosen to minimize the average codeword length,

$$L = \sum_{i=1}^{n} p_i l_i,$$

where $p_i$ is the probability of occurrence of $x_i$. Now, Shannon's [4] noiseless coding theorem for uniquely decipherable codes,

$$H(P) \le L < H(P) + 1,$$

gives the lower bound on $L$ in terms of Shannon's entropy $H(P) = -\sum_{i=1}^{n} p_i \log_D p_i$.
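The quantities above can be checked numerically. The following sketch (helper names are ours, for illustration) verifies Kraft's inequality, computes the average codeword length, and confirms Shannon's lower bound for a small binary code:

```python
import math

def satisfies_kraft(lengths, D=2):
    """Check Kraft's inequality: sum(D**-l) <= 1."""
    return sum(D ** -l for l in lengths) <= 1.0 + 1e-12

def average_length(probs, lengths):
    """Mean codeword length L = sum(p_i * l_i)."""
    return sum(p * l for p, l in zip(probs, lengths))

def shannon_entropy(probs, D=2):
    """Shannon entropy in base D."""
    return -sum(p * math.log(p, D) for p in probs if p > 0)

probs = [0.5, 0.25, 0.25]
lengths = [1, 2, 2]  # e.g. codewords 0, 10, 11

print(satisfies_kraft(lengths))          # True
print(average_length(probs, lengths))    # 1.5
print(shannon_entropy(probs))            # ~1.5, so H(P) <= L < H(P)+1 holds
```

For this dyadic distribution the code is optimal, so $L$ equals $H(P)$ exactly.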

Campbell [9] proposed a special exponentiated mean codeword length of order $t$ for uniquely decipherable codes, given by

$$L(t) = \frac{1}{t} \log_D \left( \sum_{i=1}^{n} p_i D^{t l_i} \right),$$

and proved a noiseless coding theorem: for the optimal code, $H_\alpha(P) \le L(t) < H_\alpha(P) + 1$, where $\alpha = 1/(1+t)$, $t > 0$.

The above $H_\alpha(P) = \frac{1}{1-\alpha} \log_D \left( \sum_{i=1}^{n} p_i^\alpha \right)$ is Rényi's measure of entropy of order $\alpha$. As $t \to 0$ (so that $\alpha \to 1$), it is easily shown that $L(t)$ approaches $L$ and $H_\alpha(P)$ approaches Shannon's entropy $H(P)$.
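A small numerical sketch (assuming the forms of $L(t)$ and $H_\alpha(P)$ as written above; helper names are ours) illustrates the bound $H_\alpha(P) \le L(t)$ and the limit $t \to 0$:

```python
import math

def campbell_length(probs, lengths, t, D=2):
    """Campbell's exponentiated mean length:
    L(t) = (1/t) * log_D( sum p_i * D**(t*l_i) )."""
    s = sum(p * D ** (t * l) for p, l in zip(probs, lengths))
    return math.log(s, D) / t

def renyi_entropy(probs, alpha, D=2):
    """Renyi entropy of order alpha (alpha != 1)."""
    return math.log(sum(p ** alpha for p in probs), D) / (1.0 - alpha)

probs, lengths, t = [0.5, 0.25, 0.25], [1, 2, 2], 1.0
alpha = 1.0 / (1.0 + t)

print(campbell_length(probs, lengths, t))   # log2(3) ~ 1.585
print(renyi_entropy(probs, alpha))          # ~1.543, so H_alpha(P) <= L(t)
# As t -> 0, L(t) approaches the ordinary mean length (1.5 here):
print(campbell_length(probs, lengths, 1e-9))
```

The exponentiated length penalizes long codewords more heavily as $t$ grows, which is why it exceeds the ordinary mean length for $t > 0$.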

For uniquely decipherable codes, a weighted average codeword length was given by Guiasu and Picard [10]:

$$L_u = \frac{\sum_{i=1}^{n} u_i p_i l_i}{\sum_{i=1}^{n} u_i p_i}.$$

This is defined as the average cost of transmitting the letters $x_i$ with probabilities $p_i$ and utilities $u_i$.
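As an illustration (the utility values here are hypothetical), the weighted mean length can be computed as follows:

```python
def weighted_mean_length(probs, utils, lengths):
    """Guiasu-Picard weighted mean codeword length:
    L_u = sum(u_i * p_i * l_i) / sum(u_i * p_i)."""
    num = sum(u * p * l for u, p, l in zip(utils, probs, lengths))
    den = sum(u * p for u, p in zip(utils, probs))
    return num / den

probs   = [0.5, 0.25, 0.25]
utils   = [1.0, 3.0, 1.0]   # hypothetical utilities of the letters
lengths = [1, 2, 2]

print(weighted_mean_length(probs, utils, lengths))           # ~1.667
# With equal utilities this reduces to the ordinary mean length:
print(weighted_mean_length(probs, [1, 1, 1], lengths))       # 1.5
```

Raising the utility of a letter with a long codeword pulls the weighted mean above the ordinary mean length, reflecting the higher cost of transmitting useful but infrequent letters.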

A quantitative-qualitative measure of entropy was defined by Belis and Guiasu [11], for which Longo [12] gave a lower bound on the useful mean codeword length. Further, a noiseless coding theorem was verified by Guiasu and Picard by introducing a lower bound for the useful mean codeword length, and Gurdial and Pessoa [13] proved a noiseless coding theorem by giving lower bounds for useful mean codeword lengths of order $\alpha$ in terms of useful measures of information of order $\alpha$. The following information measure was introduced by Belis and Guiasu:

$$H(P; U) = -\sum_{i=1}^{n} u_i p_i \log p_i.$$

A similar quantitative-qualitative information measure was given by Taneja and Tuteja, and Bhaker and Hooda gave further such information measures. Now, Baig and Dar [14, 15] proposed a new fuzzy information measure of order $\alpha$ and type $\beta$ and, for this information measure, introduced a corresponding mean codeword length of order $\alpha$ and type $\beta$.

Choudhary and Kumar [16] proved a noiseless coding theorem on generalized R-norm entropy, and Choudhary and Kumar [17] proposed some coding theorems on generalized Havrda-Charvát and Tsallis entropy. Baig and Dar [14, 15] introduced a few coding theorems on a fuzzy entropy function depending upon the parameters R and V and gave a fuzzy coding theorem on a generalized fuzzy cost measure. Taneja and Bhatia [18] proposed a generalized mean codeword length for the best 1:1 code, and Parkash and Sharma [19, 20] proved some noiseless coding theorems corresponding to fuzzy entropies and introduced a new class of fuzzy coding theorems. Parkash [21] introduced a new parametric measure of fuzzy entropy. Gupta et al. [22] proposed 1:1 codes for a generalized quantitative-qualitative measure of inaccuracy. Jain and Tuteja [23] introduced a coding theorem connected with useful entropy of order $\alpha$ and type $\beta$. Tuli [24] introduced mean codeword lengths and their correspondence with entropy measures.
Tuli and Sharma [25] proved some new coding theorems and consequently developed some new weighted fuzzy mean codeword lengths corresponding to the well-known measures of weighted fuzzy entropy.

In the next section, we prove the fuzzy noiseless coding theorem for 1:1 codes over a binary alphabet and hence show that such codes are less constrained, since their codewords need only be distinct and need not satisfy Kraft's inequality.
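For orientation, the best binary 1:1 code assigns the shortest binary strings 0, 1, 00, 01, 10, 11, 000, ... to the messages in decreasing order of probability (or membership), so the $i$-th message receives a codeword of length $\lfloor \log_2(i+1) \rfloor$. A sketch, assuming this standard construction (the function name is ours):

```python
import math

def best_one_to_one_lengths(n):
    """Lengths of the best binary 1:1 code: message i (1-indexed, in
    decreasing order of probability/membership) gets the i-th shortest
    binary string, of length floor(log2(i + 1))."""
    return [int(math.floor(math.log2(i + 1))) for i in range(1, n + 1)]

lengths = best_one_to_one_lengths(6)
print(lengths)                             # [1, 1, 2, 2, 2, 2]
# Kraft's inequality fails, confirming that 1:1 codes are less
# constrained than uniquely decipherable codes:
print(sum(2.0 ** -l for l in lengths))     # 2.0 > 1
```

Because the Kraft sum exceeds 1, no uniquely decipherable code can achieve these lengths; the 1:1 code trades unique decipherability for shorter codewords.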

2. Fuzzy Noiseless Coding Theorem for 1:1 Codes

Theorem 1. The mean codeword length of order $\alpha$ and type $\beta$ for a binary 1:1 code satisfies:

where

Proof. Consider the following fuzzy information measure:

Corresponding to the above fuzzy information measure, we introduce the given mean codeword length for a 1:1 code as follows:

Using Hölder's inequality,

$$\sum_{i=1}^{n} x_i y_i \ge \left( \sum_{i=1}^{n} x_i^p \right)^{1/p} \left( \sum_{i=1}^{n} y_i^q \right)^{1/q}$$

for all $x_i, y_i > 0$, $i = 1, 2, \ldots, n$, and $\frac{1}{p} + \frac{1}{q} = 1$, where $p < 1$ ($p \ne 0$) and $q < 0$, or $q < 1$ ($q \ne 0$) and $p < 0$.
Let , . Then, (15) becomes

Now, we use the following condition:

Thus, the inequality reduces to

Raising both sides to the power , we get

Multiplying both sides by , we get

That is, .
Hence, ; this proves the result.

Theorem 2. The mean codeword length of order $\alpha$ and type $\beta$ for a binary 1:1 code satisfies the following:

Proof. We have

That is,

or

or

Multiplying both sides by and summing over $i$, we get

or

or

Since for , after suitable operations we get

or we can write , which proves the result.

3. Conclusion

In modern data transmission and storage systems, error-correcting codes are a key ingredient in achieving a high degree of reliability. Over a noiseless channel, the problem is one of efficient coding of messages: we have to maximize the number of messages that can be sent through the channel in a given time. Thus, we have introduced a mean codeword length of order $\alpha$ and type $\beta$ for 1:1 codes and analyzed the relationship between the average codeword length and fuzzy information measures for binary 1:1 codes. Further, noiseless coding theorems associated with a fuzzy information measure have been established.

4. Future Research Endeavors

Decision-making problems in general utilize information from the knowledge base of experts in view of their perception of the system. Besides mathematical studies, algorithms for applications in decision-making can also be discussed as an extension of the above research. Other fuzzy information measures can be compared in light of the fuzzy noiseless theorem for binary 1:1 codes. Further, this fuzzy noiseless coding theorem can be utilized for characterizing and applying R-norm entropy measures.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.