Research Article

Comparative Study of Generalized Quantitative-Qualitative Inaccuracy Fuzzy Measures for Noiseless Coding Theorem and 1:1 Codes
H. D. Arora¹ and Anjali Dhiman²

¹Department of Mathematics, Amity Institute of Applied Sciences, Amity University, Noida 201301, India
²Department of Mathematics, Ideal Institute of Management and Technology, GGSIP University, Delhi 110092, India
Correspondence should be addressed to Anjali Dhiman; [email protected]
Received 3 March 2015; Accepted 1 April 2015
Academic Editor: Ram U. Verma
Copyright © 2015 H. D. Arora and A. Dhiman. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
In coding theory, we study various properties of codes for application in data compression, cryptography, error correction, and network coding. The study of codes is introduced in Information Theory, electrical engineering, mathematics, and computer sciences for the transmission of data through reliable and efficient methods. We have to consider how messages can be coded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time; thus, the minimum value of the mean codeword length subject to a given constraint on the codeword lengths has to be found. In this paper, we introduce a mean codeword length of order α and type β for 1:1 codes and analyze the relationship between average codeword length and fuzzy information measures for binary 1:1 codes. Further, a noiseless coding theorem associated with a fuzzy information measure is established.
1. Introduction
In the 1940s, a new branch of mathematics was introduced as Information Theory. Information Theory considers the problems of how to process, store, and retrieve information, and hence of decision-making; it deals with the study of how information can be transmitted over communication channels. In 1924 and 1928, Nyquist [1, 2] first studied measures of information, and Hartley [3] observed that information measures were logarithmic in nature. New properties of information sources and communication channels were published by Shannon [4] in his research paper “A Mathematical Theory of Communication,” and Wiener [5] independently obtained results similar to Shannon's. A large number of applications of Information Theory have since been developed in various fields of the social, physical, and biological sciences, for example, economics, statistics, accounting, language, psychology, ecology, pattern recognition, computer sciences, and fuzzy sets. Later, Renyi [6] introduced the entropy of order 𝑟, of which Shannon's entropy is the limiting case.
Fuzzy set theory was introduced by Zadeh [7]. It relates to the uncertainty occurring in human cognitive processes and has found applications in engineering, the biological sciences, and business. Zadeh introduced fuzzy entropy, a measure of fuzzy information based on Shannon's entropy.
Let $A$ be a fuzzy set in the universe of discourse $\Omega$. We define the membership function as $\mu_A : \Omega \to [0, 1]$, where $\mu_A(x)$ is the degree of membership of each $x \in \Omega$. The following properties, given by De Luca and Termini [8], are to be satisfied by fuzzy entropy:
(1) Fuzzy entropy is minimum if and only if the set is crisp.
(2) Fuzzy entropy is maximum when the membership value is 0.5.
(3) Fuzzy entropy decreases if the set is sharpened.
(4) Fuzzy entropy of a set is the same as that of its complement.
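These four properties can be checked numerically. The explicit entropy formula is not reproduced in this section, so the sketch below assumes De Luca and Termini's logarithmic form $H(A) = -\sum_i [\mu_A(x_i)\log\mu_A(x_i) + (1-\mu_A(x_i))\log(1-\mu_A(x_i))]$; the function name and the sample sets are illustrative only.

```python
import math

def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy, with the convention 0*log(0) = 0."""
    def h(t):
        return -t * math.log(t) if t > 0 else 0.0
    return sum(h(m) + h(1.0 - m) for m in mu)

crisp      = [0.0, 1.0, 1.0, 0.0]           # memberships are all 0 or 1
vague      = [0.5, 0.5, 0.5, 0.5]           # maximally fuzzy set
original   = [0.3, 0.8, 0.6, 0.1]
sharpened  = [0.2, 0.9, 0.7, 0.05]          # each value pushed towards 0 or 1
complement = [1.0 - m for m in original]

print(fuzzy_entropy(crisp) == 0.0)                                     # property (1)
print(fuzzy_entropy(vague) > fuzzy_entropy(original))                  # property (2)
print(fuzzy_entropy(sharpened) < fuzzy_entropy(original))              # property (3)
print(abs(fuzzy_entropy(complement) - fuzzy_entropy(original)) < 1e-12)  # property (4)
```

All four checks print True for these sample sets, matching the properties listed above.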
Hindawi Publishing Corporation, International Journal of Mathematics and Mathematical Sciences, Volume 2015, Article ID 258675, 6 pages, http://dx.doi.org/10.1155/2015/258675

In coding theory, we study various properties of codes for application in data compression, cryptography, error correction, and network coding. The study of codes is introduced in Information Theory, electrical engineering, mathematics, and computer sciences for the transmission of data through reliable and efficient methods. We have to consider how messages can be coded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time; thus, the minimum value of the mean codeword length subject to a given constraint on the codeword lengths has to be found. Let $X$ be a random variable taking values $x_i$, $i = 1, 2, \ldots, n$, that generates messages represented over the set $\{a_1, a_2, \ldots, a_D\}$, which need to be transmitted. This set is called the code alphabet, and the sequence assigned to each $x_i$ is called a codeword. The codeword length $l_i$ associated with $x_i$ satisfies Kraft's inequality,

$$D^{-l_1} + D^{-l_2} + \cdots + D^{-l_n} \le 1, \tag{1}$$
where $D$ is the size of the alphabet. Further, codes are chosen to minimize the average codeword length, which is given by

$$L = \sum_{i=1}^{n} l_i p_i, \tag{2}$$

where $p_i$ is the probability of occurrence of $x_i$. Now, Shannon's [4] noiseless coding theorem for uniquely decipherable codes,

$$\frac{H(P)}{\log D} \le L < \frac{H(P)}{\log D} + 1, \tag{3}$$

gives the lower bound on $L$ in terms of Shannon's entropy $H(P)$.
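As a concrete check of (1)–(3), the sketch below builds a binary Shannon code with lengths $l_i = \lceil -\log_2 p_i \rceil$ (a standard construction, not taken from this paper) and verifies both Kraft's inequality and the Shannon bound; with $D = 2$ and logarithms to base 2, (3) reads $H(P) \le L < H(P) + 1$.

```python
import math

def shannon_entropy(p):
    """H(P) in bits: -sum p_i log2 p_i."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]
lengths = [math.ceil(-math.log2(pi)) for pi in p]   # Shannon lengths [1, 2, 3, 3]

kraft = sum(2.0 ** -l for l in lengths)             # left side of (1), with D = 2
L = sum(l * pi for l, pi in zip(lengths, p))        # mean codeword length (2)
H = shannon_entropy(p)

print(kraft <= 1)        # Kraft's inequality (1) holds
print(H <= L < H + 1)    # noiseless coding theorem (3), base-2 form
```

For this dyadic distribution the code is optimal: $H = L = 1.75$ bits, so the lower bound in (3) is attained exactly.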
Campbell [9] proposed a special exponentiated mean codeword length of order $\alpha$ for uniquely decipherable codes, given by

$$L_\alpha = \frac{\alpha}{1-\alpha} \log_D \left[ \sum_{i=1}^{n} p_i D^{(1-\alpha) l_i / \alpha} \right], \tag{4}$$

and proved a noiseless coding theorem for it, showing that its lower bound lies between $R_\alpha(P)$ and $R_\alpha(P) + 1$, where

$$R_\alpha(P) = (1-\alpha)^{-1} \log_D \left[ \sum_{i=1}^{n} p_i^\alpha \right]; \quad \alpha > 0, \ \alpha \neq 1,$$

is Renyi's measure of entropy of order $\alpha$. As $\alpha \to 1$, it is easily shown that $L_\alpha \to L$ and $R_\alpha(P)$ approaches $H(P)$.

For uniquely decipherable codes, a weighted average length was given by Guiasu and Picard [10]:

$$L = \frac{\sum_{i=1}^{n} u_i l_i p_i}{\sum_{i=1}^{n} u_i p_i}. \tag{5}$$

This is defined as the average cost of transmitting the letters $x_i$ with probabilities $p_i$ and utilities $u_i$.
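The limiting behaviour $L_\alpha \to L$ and $R_\alpha(P) \to H(P)$ as $\alpha \to 1$ can be illustrated numerically; a small sketch with $D = 2$ (function names and sample values are ours):

```python
import math

def renyi(p, alpha, D=2):
    """R_alpha(P) = (1 - alpha)^(-1) * log_D(sum p_i^alpha), alpha > 0, alpha != 1."""
    return math.log(sum(pi ** alpha for pi in p), D) / (1 - alpha)

def campbell(p, lengths, alpha, D=2):
    """L_alpha = (alpha/(1 - alpha)) * log_D(sum p_i * D^((1 - alpha) l_i / alpha))."""
    s = sum(pi * D ** ((1 - alpha) * l / alpha) for pi, l in zip(p, lengths))
    return (alpha / (1 - alpha)) * math.log(s, D)

p = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]
H = -sum(pi * math.log2(pi) for pi in p)        # Shannon entropy, 1.75 bits
L = sum(l * pi for l, pi in zip(lengths, p))    # ordinary mean length (2), 1.75

print(abs(renyi(p, 0.999) - H) < 1e-2)              # R_alpha -> H(P) as alpha -> 1
print(abs(campbell(p, lengths, 0.999) - L) < 1e-2)  # L_alpha -> L as alpha -> 1
```

Evaluating at $\alpha = 0.999$ already reproduces $H(P)$ and $L$ to within a hundredth of a bit for this distribution.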
A quantitative-qualitative measure of entropy was defined by Belis and Guiasu [11], for which Longo [12] gave the lower bound on the useful mean codeword length. Further, a noiseless coding theorem was verified by Guiasu and Picard by introducing a lower bound for the useful mean codeword length, and Gurdial and Pessoa [13] proved a noiseless coding theorem by giving lower bounds for useful mean codeword lengths of order $\alpha$ in terms of useful measures of information of order $\alpha$. The following information measure was introduced by Belis and Guiasu:

$$H(P, U) = -\sum_{i=1}^{n} u_i p_i \log p_i. \tag{6}$$

Similarly, a quantitative-qualitative information measure was given by Taneja and Tuteja:

$$H\left(\frac{P}{Q}, U\right) = \sum_{i=1}^{n} p_i u_i \log\left(\frac{p_i}{q_i}\right). \tag{7}$$

Further, Bhaker and Hooda gave the following information measures:

$$H\left(\frac{P}{Q}, U\right) = \frac{\sum_{i=1}^{n} u_i p_i \log (p_i/q_i)}{\sum_{i=1}^{n} u_i p_i},$$

$$H_\alpha\left(\frac{P}{Q}, U\right) = \frac{1}{1-\alpha} \, \frac{\sum_{i=1}^{n} u_i p_i^\alpha q_i^{1-\alpha}}{\sum_{i=1}^{n} u_i p_i}, \quad \alpha \neq 1. \tag{8}$$
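A quick sanity check on (5) and (6): when all utilities are equal, the weighting should cancel, so (6) collapses to a constant multiple of Shannon's $H(P)$ and (5) collapses to the ordinary mean length (2). A minimal sketch (names and sample values are ours; base-2 logarithms assumed):

```python
import math

def useful_information(p, u):
    """Belis-Guiasu measure (6): H(P, U) = -sum u_i p_i log p_i."""
    return -sum(ui * pi * math.log2(pi) for ui, pi in zip(u, p) if pi > 0)

def weighted_length(p, u, lengths):
    """Guiasu-Picard weighted mean length (5)."""
    num = sum(ui * li * pi for ui, li, pi in zip(u, lengths, p))
    return num / sum(ui * pi for ui, pi in zip(u, p))

p = [0.5, 0.25, 0.25]
lengths = [1, 2, 2]
H = -sum(pi * math.log2(pi) for pi in p)          # Shannon entropy, 1.5 bits
L = sum(li * pi for li, pi in zip(lengths, p))    # ordinary mean length, 1.5

u = [3.0, 3.0, 3.0]                               # equal utilities
print(abs(useful_information(p, u) - 3.0 * H) < 1e-12)   # (6) reduces to c * H(P)
print(abs(weighted_length(p, u, lengths) - L) < 1e-12)   # (5) reduces to L
```

With unequal utilities, (5) and (6) weight the rare-but-important letters more heavily, which is the whole point of the quantitative-qualitative approach.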
Now, Baig and Dar [14, 15] proposed a new fuzzy information measure of order $\alpha$ and type $\beta$:

$$H_\alpha^\beta(A; U) = \frac{1}{2^{1-\alpha} - 1} \left[ \frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right\}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A(x_i) + (1 - \mu_A(x_i)) \right\}^\beta} - 1 \right]. \tag{9}$$

And for this information measure they introduced the following mean codeword length of order $\alpha$ and type $\beta$:

$$L_\alpha^\beta(U) = \frac{1}{2^{1-\alpha} - 1} \left[ \left\{ \frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right\} D^{((1-\alpha)/\alpha) l_i}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A(x_i) + (1 - \mu_A(x_i)) \right\}^\beta} \right\}^\alpha - 1 \right]. \tag{10}$$
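Measures (9) and (10) are straightforward to evaluate numerically. The sketch below (helper names are ours) implements both and checks one sanity property: at membership value 0.5 everywhere and $\beta = 1$, the ratio inside (9) equals $2^{1-\alpha}$, so $H_\alpha^\beta(A; U) = 1$ regardless of the utilities, consistent with De Luca and Termini's maximality property.

```python
def H_alpha_beta(mu, u, alpha, beta):
    """Fuzzy information measure of order alpha and type beta, eq. (9)."""
    num = sum(ui**beta * (m**(alpha + beta - 1) + (1 - m)**(alpha + beta - 1))
              for ui, m in zip(u, mu))
    den = sum(ui**beta * (m + (1 - m))**beta for ui, m in zip(u, mu))
    return (num / den - 1) / (2**(1 - alpha) - 1)

def L_alpha_beta(mu, u, lengths, alpha, beta, D=2):
    """Mean codeword length of order alpha and type beta, eq. (10)."""
    num = sum(ui**beta * (m**beta + (1 - m)**beta) * D**((1 - alpha) / alpha * l)
              for ui, m, l in zip(u, mu, lengths))
    den = sum(ui**beta * (m + (1 - m))**beta for ui, m in zip(u, mu))
    return ((num / den)**alpha - 1) / (2**(1 - alpha) - 1)

mu = [0.5, 0.5, 0.5]          # maximally fuzzy memberships
u = [1.0, 2.0, 5.0]           # arbitrary utilities
print(abs(H_alpha_beta(mu, u, alpha=0.5, beta=1.0) - 1.0) < 1e-9)
```

With $D = 2$ and lengths $l_i$ satisfying the Kraft-type condition used later in (18), the quantity computed by `L_alpha_beta` is the one bounded in Section 2.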
Choudhary and Kumar [16] proved some noiseless coding theorems on generalized R-Norm entropy. Also, Choudhary and Kumar [17] proposed some coding theorems on generalized Havrda-Charvat and Tsallis entropy. Baig and Dar [14, 15] introduced a few coding theorems on a fuzzy entropy function depending upon the parameters R and V and gave a fuzzy coding theorem on a generalized fuzzy cost measure. Taneja and Bhatia [18] proposed a generalized mean codeword length for the best 1:1 code, and Parkash and Sharma [19, 20] proved some noiseless coding theorems corresponding to fuzzy entropies and introduced a new class of fuzzy coding theorems. Parkash [21] introduced a new parametric measure of fuzzy entropy. Gupta et al. [22] proposed 1:1 codes for a generalized quantitative-qualitative measure of inaccuracy.
Jain and Tuteja [23] introduced a coding theorem connectedwith useful entropy of order 𝛼 and type 𝛽. Tuli [24] intro-duced mean codeword lengths and their correspondencewith entropy measures. Tuli and Sharma [25] proved somenew coding theorems and consequently developed some newweighted fuzzy mean codeword lengths corresponding to thewell-known measures of weighted fuzzy entropy.
In the next section, we prove the fuzzy noiseless coding theorem for binary 1:1 codes and hence show that such codes are less constrained.
2. Fuzzy Noiseless Coding Theorem for 1:1 Codes
Theorem 1. The mean codeword length of order $\alpha$ and type $\beta$ for a binary 1:1 code satisfies

$$H_\alpha^\beta(A; U) \le L(1{:}1)_\alpha^\beta(U), \tag{11}$$

where

$$H_\alpha^\beta(A; U) = \frac{1}{2^{1-\alpha} - 1} \left[ \frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right\}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A(x_i) + (1 - \mu_A(x_i)) \right\}^\beta} - 1 \right],$$

$$L(1{:}1)_\alpha^\beta(U) = \frac{1}{2^{1-\alpha} - 1} \left[ \left\{ \frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right\} 2^{((1-\alpha)/\alpha) \log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A(x_i) + (1 - \mu_A(x_i)) \right\}^\beta} \right\}^\alpha - 1 \right]. \tag{12}$$
Proof. Consider the following fuzzy information measure:

$$H_\alpha^\beta(A; U) = \frac{1}{2^{1-\alpha} - 1} \left[ \frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right\}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A(x_i) + (1 - \mu_A(x_i)) \right\}^\beta} - 1 \right]. \tag{13}$$

Corresponding to this fuzzy information measure, we introduce the following mean codeword length for a 1:1 code:

$$L(1{:}1)_\alpha^\beta(U) = \frac{1}{2^{1-\alpha} - 1} \left[ \left\{ \frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right\} 2^{((1-\alpha)/\alpha) \log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A(x_i) + (1 - \mu_A(x_i)) \right\}^\beta} \right\}^\alpha - 1 \right]. \tag{14}$$
Using Hölder's inequality (in its reverse form),

$$\sum_{i=1}^{n} x_i y_i \ge \left( \sum_{i=1}^{n} x_i^p \right)^{1/p} \left( \sum_{i=1}^{n} y_i^q \right)^{1/q} \tag{15}$$

for all $x_i, y_i > 0$, $i = 1, 2, \ldots, n$, where $1/p + 1/q = 1$ and either $p < 1$ $(p \neq 0)$ and $q < 0$, or $q < 1$ $(q \neq 0)$ and $p < 0$. Let $p = (\alpha - 1)/\alpha$, $q = 1 - \alpha$, and

$$x_i = \frac{\left( u_i \left\{ \mu_A(x_i) + (1 - \mu_A(x_i)) \right\} \right)^{\alpha\beta/(\alpha-1)} 2^{-\log((i+2)/2)}}{\sum_{i=1}^{n} \left( u_i \left\{ \mu_A(x_i) + (1 - \mu_A(x_i)) \right\} \right)^{\alpha\beta/(\alpha-1)}},$$

$$y_i = \frac{u_i^{\beta/(1-\alpha)} \left( \mu_A^{(\alpha+\beta-1)/(1-\alpha)}(x_i) + (1 - \mu_A(x_i))^{(\alpha+\beta-1)/(1-\alpha)} \right)}{\sum_{i=1}^{n} \left( u_i \left\{ \mu_A(x_i) + (1 - \mu_A(x_i)) \right\} \right)^{\beta/(1-\alpha)}}. \tag{16}$$
Then (15) becomes

$$\frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta-1}(x_i) + (1 - \mu_A(x_i))^{\beta-1} \right\} 2^{-\log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta-1}(x_i) + (1 - \mu_A(x_i))^{\beta-1} \right\}} \ge \left[ \frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right\} 2^{((1-\alpha)/\alpha) \log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right\}} \right]^{\alpha/(\alpha-1)} \cdot \left[ \frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right\}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right\}} \right]^{1/(1-\alpha)}. \tag{17}$$
Now, we use the following condition:

$$\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta-1}(x_i) + (1 - \mu_A(x_i))^{\beta-1} \right\} D^{-l_i} \le \sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right\}. \tag{18}$$
Thus, the inequality reduces to

$$\left[ \frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right\} 2^{((1-\alpha)/\alpha) \log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right\}} \right]^{\alpha/(1-\alpha)} \ge \left[ \frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right\}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right\}} \right]^{1/(1-\alpha)}. \tag{19}$$
Now, raising both sides to the power $(1 - \alpha)$ and subtracting 1 from both sides, we get

$$\left[ \frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right\} 2^{((1-\alpha)/\alpha) \log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right\}} \right]^{\alpha} - 1 \ge \frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right\}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right\}} - 1. \tag{20}$$
Hence, multiplying both sides by $1/(2^{1-\alpha} - 1) > 0$, we get

$$\frac{1}{2^{1-\alpha} - 1} \left[ \left\{ \frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right\} 2^{((1-\alpha)/\alpha) \log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A(x_i) + (1 - \mu_A(x_i)) \right\}^\beta} \right\}^\alpha - 1 \right] \ge \frac{1}{2^{1-\alpha} - 1} \left[ \frac{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right\}}{\sum_{i=1}^{n} u_i^\beta \left\{ \mu_A(x_i) + (1 - \mu_A(x_i)) \right\}^\beta} - 1 \right]. \tag{21}$$
That is, $L(1{:}1)_\alpha^\beta(U) \ge H_\alpha^\beta(A; U)$, which proves the result.
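Theorem 1 can be spot-checked numerically. The sketch below evaluates (9) and (12) with $D = 2$, reading the logarithm in the exponent of (12) as base 2 (an assumption; the paper does not fix the base), so that $2^{((1-\alpha)/\alpha)\log_2((i+2)/2)} = ((i+2)/2)^{(1-\alpha)/\alpha}$; the membership, utility, and parameter choices are ours.

```python
def H_fuzzy(mu, u, alpha, beta):
    """Left side of (11): measure (9); its denominator reduces to sum u_i^beta."""
    num = sum(ui**beta * (m**(alpha + beta - 1) + (1 - m)**(alpha + beta - 1))
              for ui, m in zip(u, mu))
    den = sum(ui**beta for ui in u)
    return (num / den - 1) / (2**(1 - alpha) - 1)

def L_one_one(mu, u, alpha, beta):
    """Right side of (11): length (12) with l_i = log2((i+2)/2), i = 1..n."""
    num = sum(ui**beta * (m**beta + (1 - m)**beta) * ((i + 2) / 2)**((1 - alpha) / alpha)
              for i, (ui, m) in enumerate(zip(u, mu), start=1))
    den = sum(ui**beta for ui in u)
    return ((num / den)**alpha - 1) / (2**(1 - alpha) - 1)

mu = [0.2, 0.7, 0.5]
u = [1.0, 2.0, 3.0]
for alpha, beta in [(0.5, 1.0), (0.7, 2.0)]:
    print(H_fuzzy(mu, u, alpha, beta) <= L_one_one(mu, u, alpha, beta))   # (11) holds
```

Both parameter pairs print True for these sample values; this is an illustration, not a substitute for the proof.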
Theorem 2. The mean codeword length of order $\alpha$ and type $\beta$ for a binary 1:1 code satisfies

$$L(1{:}1)_\alpha^\beta(U) < H_\alpha^\beta(A; U) \, 2^{1-\alpha} + 1. \tag{22}$$
Proof. We have

$$2^{-\log((i+2)/2)} \le \frac{\mu_A^{\alpha}(x_i) + (1 - \mu_A(x_i))^{\alpha}}{\left( \sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right) \right) / \left( \sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right) \right)}. \tag{23}$$

That is,

$$\log\left(\frac{i+2}{2}\right) \le \log\left[ \left( \frac{\mu_A^{\alpha}(x_i) + (1 - \mu_A(x_i))^{\alpha}}{\left( \sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right) \right) / \left( \sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right) \right)} \right)^{-1} \cdot 2 \right] \tag{24}$$
or

$$2^{\log((i+2)/2)} < \left( \frac{\mu_A^{\alpha}(x_i) + (1 - \mu_A(x_i))^{\alpha}}{\left( \sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right) \right) / \left( \sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right) \right)} \right)^{-1} \cdot 2 \tag{25}$$

or

$$2^{((1-\alpha)/\alpha) \log((i+2)/2)} < \left( \frac{\mu_A^{\alpha}(x_i) + (1 - \mu_A(x_i))^{\alpha}}{\left( \sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right) \right) / \left( \sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right) \right)} \right)^{(\alpha-1)/\alpha} \cdot 2^{(1-\alpha)/\alpha}. \tag{26}$$
Hence, multiplying both sides by $u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right) / \sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right)$ and summing over $i = 1, 2, \ldots, n$, we get

$$\frac{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right) 2^{((1-\alpha)/\alpha) \log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right)} < \left( \frac{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right)}{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right)} \right)^{1/\alpha} \cdot 2^{(1-\alpha)/\alpha} \tag{27}$$
or

$$\left( \frac{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right) 2^{((1-\alpha)/\alpha) \log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right)} \right)^{\alpha} < \frac{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right)}{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right)} \cdot 2^{1-\alpha} \tag{28}$$
or

$$\left( \frac{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right) 2^{((1-\alpha)/\alpha) \log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right)} \right)^{\alpha} - 1 < \left( \frac{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right)}{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right)} - 1 \right) \cdot 2^{1-\alpha} + 2^{1-\alpha} - 1. \tag{29}$$
Since $2^{1-\alpha} - 1 > 0$ for $0 < \alpha < 1$, dividing both sides by $2^{1-\alpha} - 1$ gives

$$\frac{1}{2^{1-\alpha} - 1} \left[ \left( \frac{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right) 2^{((1-\alpha)/\alpha) \log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right)} \right)^{\alpha} - 1 \right] < \frac{1}{2^{1-\alpha} - 1} \left[ \frac{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\alpha+\beta-1}(x_i) + (1 - \mu_A(x_i))^{\alpha+\beta-1} \right)}{\sum_{i=1}^{n} u_i^\beta \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right)} - 1 \right] \cdot 2^{1-\alpha} + \frac{2^{1-\alpha} - 1}{2^{1-\alpha} - 1} \tag{30}$$
or we can write $L(1{:}1)_\alpha^\beta(U) < H_\alpha^\beta(A; U) \, 2^{1-\alpha} + 1$, which proves the result.
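Theorems 1 and 2 together sandwich the 1:1 length between $H_\alpha^\beta$ and $H_\alpha^\beta \, 2^{1-\alpha} + 1$. A numerical spot-check under the same assumptions as before (base-2 logarithm in (12), sample parameters ours; the bounds rest on conditions such as (18) and (23), so not every parameter set need be covered):

```python
def H_fuzzy(mu, u, alpha, beta):
    """Measure (9); its denominator reduces to sum u_i^beta."""
    num = sum(ui**beta * (m**(alpha + beta - 1) + (1 - m)**(alpha + beta - 1))
              for ui, m in zip(u, mu))
    return (num / sum(ui**beta for ui in u) - 1) / (2**(1 - alpha) - 1)

def L_one_one(mu, u, alpha, beta):
    """Length (12) with l_i = log2((i+2)/2), i = 1..n."""
    num = sum(ui**beta * (m**beta + (1 - m)**beta) * ((i + 2) / 2)**((1 - alpha) / alpha)
              for i, (ui, m) in enumerate(zip(u, mu), start=1))
    return ((num / sum(ui**beta for ui in u))**alpha - 1) / (2**(1 - alpha) - 1)

cases = [
    ([0.2, 0.7, 0.5], [1.0, 2.0, 3.0], 0.5, 1.0),
    ([0.3, 0.6], [1.0, 1.0], 0.7, 2.0),
]
for mu, u, alpha, beta in cases:
    H = H_fuzzy(mu, u, alpha, beta)
    L = L_one_one(mu, u, alpha, beta)
    print(H <= L < H * 2**(1 - alpha) + 1)   # Theorems 1 and 2 together
```

Both sample cases print True, illustrating the two-sided bound of Section 2 for these parameter choices.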
3. Conclusion
In modern data transmission and storage systems, error correcting codes are the key ingredients that help in achieving a high degree of reliability. A noiseless channel poses the problem of coding messages efficiently, and hence we have to maximize the number of messages that can be sent through the channel in a given time. Thus, we have introduced a mean codeword length of order α and type β for 1:1 codes and analyzed the relationship between average codeword length and fuzzy information measures for binary 1:1 codes. Further, noiseless coding theorems associated with the fuzzy information measure have been established.
4. Future Research Endeavors
Decision-making problems in general utilize information from the knowledge base of experts in view of their perception of the system at hand. Besides mathematical studies, algorithms for application in decision-making can also be discussed as extensions of the above research. Other fuzzy information measures can be compared in the light of the fuzzy noiseless coding theorem and binary 1:1 codes. Further, this fuzzy noiseless coding theorem can be utilized for characterizing and applying R-Norm entropy measures.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
References
[1] H. Nyquist, “Certain factors affecting telegraph speed,” Bell System Technical Journal, pp. 324–325, 1924.
[2] H. Nyquist, “Certain topics in telegraph transmission theory,” Transactions of the American Institute of Electrical Engineers, vol. 47, no. 2, pp. 617–644, 1928.
[3] R. V. L. Hartley, “Transmission of information,” Bell System Technical Journal, vol. 7, no. 3, pp. 535–563, 1928.
[4] C. E. Shannon, “A mathematical theory of communication,” The Bell System Technical Journal, vol. 27, no. 3, pp. 379–656, 1948.
[5] N. Wiener, Cybernetics, MIT Press, John Wiley & Sons, New York, NY, USA, 1948.
[6] A. Renyi, “On measures of entropy and information,” in Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, pp. 547–561, University of California Press, 1961.
[7] L. A. Zadeh, “Fuzzy sets,” Information and Control, vol. 8, pp. 338–353, 1965.
[8] A. De Luca and S. Termini, “A definition of a non-probabilistic entropy in the setting of fuzzy sets,” Information and Control, vol. 20, no. 4, pp. 301–312, 1972.
[9] L. L. Campbell, “A coding theorem and Renyi's entropy,” Information and Control, vol. 8, pp. 423–429, 1965.
[10] S. Guiasu and C.-F. Picard, “Borne inférieure de la longueur utile de certains codes,” Comptes Rendus de l'Académie des Sciences, vol. 273, pp. 248–251, 1971.
[11] M. Belis and S. Guiasu, “A quantitative-qualitative measure of information in cybernetic systems,” IEEE Transactions on Information Theory, vol. 14, no. 4, pp. 593–594, 1968.
[12] G. Longo, Quantitative-Qualitative Measures of Information, Springer, New York, NY, USA, 1972.
[13] F. Gurdial and F. Pessoa, “On useful information of order α,” Journal of Combinatorics, Information & System Sciences, vol. 2, no. 4, pp. 158–162, 1977.
[14] M. A. K. Baig and M. J. Dar, “Some coding theorems on fuzzy entropy function depending upon parameter R and G,” IOSR Journal of Mathematics, vol. 9, no. 6, pp. 119–123, 2014.
[15] M. A. K. Baig and M. J. Dar, “Fuzzy coding theorem on generalized fuzzy cost measure,” Asian Journal of Fuzzy and Applied Mathematics, vol. 2, pp. 28–34, 2014.
[16] A. Choudhary and S. Kumar, “Some more noiseless coding theorem on generalized R-Norm entropy,” Journal of Mathematics Research, vol. 3, no. 1, pp. 125–130, 2011.
[17] A. Choudhary and S. Kumar, “Some coding theorems on generalized Havrda-Charvat and Tsallis entropy,” Tamkang Journal of Mathematics, vol. 43, no. 3, pp. 437–444, 2012.
[18] H. C. Taneja and P. K. Bhatia, “On a generalized mean codeword length for the best 1:1 code,” Soochow Journal of Mathematics, vol. 16, pp. 1–6, 1990.
[19] O. Parkash and P. K. Sharma, “Noiseless coding theorems corresponding to fuzzy entropies,” Southeast Asian Bulletin of Mathematics, vol. 27, no. 6, pp. 1073–1080, 2004.
[20] O. Parkash and P. K. Sharma, “A new class of fuzzy coding theorems,” Caribbean Journal of Mathematical and Computing Sciences, vol. 12, pp. 1–10, 2002.
[21] O. Parkash, “A new parametric measure of fuzzy entropy,” in Proceedings of the 7th International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU '98), pp. 1732–1737, Paris, France, July 1998.
[22] P. Gupta, Niranjan, and H. Arora, “On best 1:1 codes for generalized quantitative-qualitative measure of inaccuracy,” African Journal of Mathematics and Computer Science Research, vol. 4, no. 3, pp. 159–163, 2011.
[23] P. Jain and R. K. Tuteja, “On a coding theorem connected with useful entropy of order α and type β,” Kybernetika, vol. 23, no. 5, pp. 420–427, 1987.
[24] R. K. Tuli, “Mean codeword lengths and their correspondence with entropy measures,” International Journal of Engineering and Natural Sciences, vol. 4, pp. 175–180, 2010.
[25] R. K. Tuli and C. S. Sharma, “Applications of measures of fuzzy entropy to coding theory,” American Journal of Mathematics and Sciences, vol. 1, pp. 119–124, 2012.