

Research Article

Generalized Residual Entropy and Upper Record Values

Suchandan Kayal

Department of Mathematics, National Institute of Technology Rourkela, Rourkela 769008, India

Correspondence should be addressed to Suchandan Kayal; [email protected]

Received 1 June 2015; Accepted 26 July 2015

Academic Editor: Ramón M. Rodríguez-Dagnino

Copyright © 2015 Suchandan Kayal. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

In this communication, we deal with a generalized residual entropy of record values and weighted distributions. Some results on the monotone behaviour of the generalized residual entropy in record values are obtained. Upper and lower bounds are presented. Further, based on this measure, we study some comparison results between a random variable and its weighted version. Finally, we describe some techniques to estimate the generalized residual entropy of a lifetime distribution.

1. Introduction

There have been several attempts made by various researchers to generalize the Shannon entropy (see Shannon [1]) since its appearance in the Bell System Technical Journal. For various properties and applications of the generalized entropy measures, we refer to Kapur [2], Renyi [3], Tsallis [4], and Varma [5]. In this paper, we consider the generalized residual entropy due to Varma [5]. Let $X$ be a nonnegative random variable representing the lifetime of a system with an absolutely continuous cumulative distribution function $F(x)$, probability density function $f(x)$, survival function $\bar{F}(x) = 1 - F(x)$, and hazard rate $\lambda_F(x)$. The generalized entropy of $X$ is given by (see Varma [5])
$$V_{\alpha,\beta}(X) = \left(\frac{1}{\beta-\alpha}\right) \ln \int_{0}^{\infty} f^{\alpha+\beta-1}(x)\,dx, \qquad 0 \le \beta-1 < \alpha < \beta. \tag{1}$$

Measure (1) reduces to the Renyi entropy (see Renyi [3]) when $\beta = 1$ and to the Shannon entropy (see Shannon [1]) when $\beta = 1$ and $\alpha \to 1$. We often find situations in practice where the measure defined by (1) is not an appropriate tool to deal with uncertainty. For example, in reliability and life testing studies, it is sometimes required to take the current age of a system into account. Here, one may be interested in the uncertainty of the random variable $X_t = [X - t \mid X \ge t]$. The random variable $X_t$ is dubbed the residual lifetime of a system which has survived up to time $t \ge 0$ and is still working. Analogous to Ebrahimi [6], the generalized entropy of the residual lifetime $X_t$ is given by
$$V_{\alpha,\beta}(X; t) = \left(\frac{1}{\beta-\alpha}\right) \ln \int_{t}^{\infty} \frac{f^{\alpha+\beta-1}(x)}{\bar{F}^{\alpha+\beta-1}(t)}\,dx, \qquad 0 \le \beta-1 < \alpha < \beta,\ t \ge 0, \tag{2}$$
which is also known as the generalized residual entropy. It reduces to (1) when $t = 0$. Also, (2) reduces to Renyi's residual entropy (see Asadi et al. [7]) when $\beta = 1$ and to the residual entropy (see Ebrahimi [6]) when $\beta = 1$ and $\alpha \to 1$. Based on the generalized entropy measures given in (1) and (2), several authors have obtained various results in the literature. In this direction, we refer to Kayal [8–11], Kayal and Vellaisamy [12], Kumar and Taneja [13], and Sati and Gupta [14]. In this paper, we study some properties and characterizations of the generalized residual entropy given by (2) based on the upper record values.
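For distributions with a tractable density, (2) can be checked by direct numerical integration. The following sketch (our own illustration; the function name and grid size are arbitrary choices, not part of the paper) evaluates (2) for the uniform $U(0,1)$ distribution, where a short calculation gives the closed form $V_{\alpha,\beta}(X;t) = \frac{2-\alpha-\beta}{\beta-\alpha}\ln(1-t)$:

```python
import math

def residual_varma_entropy(pdf, sf, t, alpha, beta, upper, n_grid=100_000):
    """Evaluate V_{alpha,beta}(X; t) of Eq. (2) by the trapezoidal rule,
    integrating f^{alpha+beta-1} over [t, upper] and normalizing by the
    survival function at t."""
    g = alpha + beta - 1.0
    h = (upper - t) / n_grid
    ys = [pdf(t + i * h) ** g for i in range(n_grid + 1)]
    integral = h * (sum(ys) - 0.5 * (ys[0] + ys[-1]))
    return (math.log(integral) - g * math.log(sf(t))) / (beta - alpha)

# Uniform U(0,1): pdf = 1 on (0,1), survival function 1 - x.
alpha, beta, t = 1.2, 1.5, 0.3
num = residual_varma_entropy(lambda x: 1.0, lambda x: 1.0 - x, t, alpha, beta, upper=1.0)
closed = (2.0 - alpha - beta) / (beta - alpha) * math.log(1.0 - t)
print(num, closed)  # both ≈ 0.83224
```

For densities with unbounded support, `upper` can be taken large enough that the tail of the integrand is negligible.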

Let $\{X_n : n = 1, 2, \ldots\}$ be a sequence of independent and identically distributed nonnegative random variables having an absolutely continuous cumulative distribution function $F(x)$, probability density function $f(x)$, and survival function $\bar{F}(x)$. An observation $X_j$ in an infinite sequence $X_1, X_2, \ldots$ is said to be an upper record value if its value is greater than that of all the previous observations. For convenience, we denote $U_1 = 1$ and, for $i \ge 2$, $U_i = \min\{j : U_{i-1} < j \text{ and } X_j > X_{U_{i-1}}\}$. Then $X_{U_1}, X_{U_2}, \ldots$ is called a sequence of upper record values.

Hindawi Publishing Corporation, Journal of Probability and Statistics, Volume 2015, Article ID 640426, 5 pages, http://dx.doi.org/10.1155/2015/640426

The probability density function and the survival function of the $n$th upper record value $X_{U_n}$ are given by
$$f_{U_n}(x) = \frac{H^{n-1}(x)}{(n-1)!}\, f(x), \tag{3}$$
$$\bar{F}_{U_n}(x) = \sum_{j=0}^{n-1} \frac{H^{j}(x)}{j!}\, \bar{F}(x) = \frac{\Gamma(n; H(x))}{\Gamma(n)}, \tag{4}$$

respectively, where $H(x) = -\ln \bar{F}(x)$ and $\Gamma(a; x) = \int_{x}^{\infty} e^{-u} u^{a-1}\,du$, $a > 0$, $x \ge 0$. Note that $\Gamma(a; x)$ is known as the (upper) incomplete gamma function. Also, the hazard rate of $X_{U_n}$ is
$$\lambda_{F_{U_n}}(x) = \frac{H^{n-1}(x)/(n-1)!}{\sum_{j=0}^{n-1} H^{j}(x)/j!}\, \lambda_F(x). \tag{5}$$
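The definition of upper records translates directly into code, and (4) can be spot-checked by simulation. In the sketch below (our own illustrative code and parameter choices), we extract the upper records of a fixed sequence and then estimate $P(X_{U_2} > t)$ for $U(0,1)$ observations against the closed form $(1 + H(t))(1 - t)$ obtained from (4) with $n = 2$:

```python
import math
import random

def upper_records(seq):
    """Upper record values of a sequence: the first observation, followed by
    every observation strictly greater than all previous ones."""
    records = []
    for x in seq:
        if not records or x > records[-1]:
            records.append(x)
    return records

print(upper_records([3, 1, 4, 1, 5, 9, 2, 6]))  # [3, 4, 5, 9]

# Monte Carlo check of Eq. (4) with n = 2 for U(0,1) observations.
random.seed(1)
t, trials, hits = 0.5, 200_000, 0
for _ in range(trials):
    first = random.random()
    x = random.random()
    while x <= first:      # draw until the second upper record appears
        x = random.random()
    hits += x > t
estimate = hits / trials
H = -math.log(1.0 - t)     # H(t) = -ln(survival function) for U(0,1)
exact = (1.0 + H) * (1.0 - t)
print(estimate, exact)     # the estimate is close to the exact value ≈ 0.8466
```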

Record values have widespread applications in real life. For the applications of record values in destructive testing of wooden beams and in industrial stress testing, one may refer to Glick [15] and Ahmadi and Arghami [16]. Record values are also useful in meteorological analysis and hydrology. For an extensive study of record values and their applications, we refer to Arnold et al. [17]. The paper is arranged as follows.

In Section 2, we obtain various properties of the generalized residual entropy. It is shown that the measure given by (2) of the $n$th upper record value of any distribution can be expressed in terms of that of the $n$th upper record value from the $U(0,1)$ distribution. Upper and lower bounds are obtained. The monotone behaviour of (2) based on upper record values is investigated. In Section 3, based on (2), we study comparisons between a random variable and its weighted version. We describe some techniques to estimate the generalized residual entropy of a life distribution in Section 4. Some concluding remarks are added in Section 5. Throughout the paper, we assume that the random variables are nonnegative. The terms increasing and decreasing stand for nondecreasing and nonincreasing, respectively.

2. Main Results

In this section, we study several properties of the generalized residual entropy given by (2) based on upper record values. First, we state the following lemma. The proof is straightforward and hence omitted.

Lemma 1. Let $X_{U_n^*}$ denote the $n$th upper record value from a sequence of independent observations from the $U(0,1)$ distribution, and write $\gamma = \alpha + \beta - 1$ (this abbreviation is used throughout). Then,
$$V_{\alpha,\beta}(X_{U_n^*}; t) = \left(\frac{1}{\beta-\alpha}\right) \ln \frac{\Gamma(\gamma(n-1)+1;\, -\ln(1-t))}{\Gamma^{\gamma}(n;\, -\ln(1-t))}. \tag{6}$$
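Formula (6) involves only upper incomplete gamma functions, so it can be evaluated with elementary quadrature. The sketch below (our own code; `upper_inc_gamma` is a naive trapezoidal approximation rather than a library routine) evaluates (6) for $n = 1, 2, 3$ with $\alpha + \beta > 2$; the values increase with $n$, consistent with the monotonicity established later in Theorem 8:

```python
import math

def upper_inc_gamma(a, x, cutoff=50.0, n=100_000):
    """Upper incomplete gamma Gamma(a; x) = int_x^inf e^{-u} u^{a-1} du,
    approximated by the trapezoidal rule on [x, cutoff]."""
    h = (cutoff - x) / n
    ys = [math.exp(-(x + i * h)) * (x + i * h) ** (a - 1.0) for i in range(n + 1)]
    return h * (sum(ys) - 0.5 * (ys[0] + ys[-1]))

def lemma1_value(n, t, alpha, beta):
    """Right-hand side of Eq. (6), with gamma = alpha + beta - 1."""
    g = alpha + beta - 1.0
    u = -math.log(1.0 - t)
    return math.log(upper_inc_gamma(g * (n - 1) + 1.0, u)
                    / upper_inc_gamma(float(n), u) ** g) / (beta - alpha)

alpha, beta, t = 1.2, 1.5, 0.3
vals = [lemma1_value(n, t, alpha, beta) for n in (1, 2, 3)]
print(vals)  # increasing in n (here alpha + beta = 2.7 > 2)
```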

In the following theorem, we show that the generalized residual entropy of the upper record value $X_{U_n}$ can be expressed in terms of that of $X_{U_n^*}$. Let $X$ be a random variable having a truncated gamma distribution with density function
$$f(x \mid a, b) = \frac{b^{a}}{\Gamma(a;\, bt)}\, e^{-bx} x^{a-1}, \qquad x > t \ge 0,\ a > 0,\ b > 0. \tag{7}$$
For convenience, we denote $X \sim \Gamma_t(a, b)$.

Theorem 2. The generalized residual entropy of $X_{U_n}$ can be expressed as
$$V_{\alpha,\beta}(X_{U_n}; t) = V_{\alpha,\beta}(X_{U_n^*}; F(t)) + \left(\frac{1}{\beta-\alpha}\right) \ln\left[E\left(f^{\gamma-1}\left(F^{-1}\left(1 - e^{-V_n}\right)\right)\right)\right], \tag{8}$$
where $V_n \sim \Gamma_{-\ln \bar{F}(t)}(\gamma(n-1)+1,\, 1)$.

Proof. From (2), (3), and (4), and using the transformation $u = -\ln \bar{F}(x)$, we obtain
$$V_{\alpha,\beta}(X_{U_n}; t) = \left(\frac{1}{\beta-\alpha}\right) \ln \int_{-\ln \bar{F}(t)}^{\infty} f^{\gamma-1}\left(F^{-1}\left(1-e^{-u}\right)\right) \frac{e^{-u}\, u^{(n-1)\gamma}}{\left[\Gamma(n;\, -\ln \bar{F}(t))\right]^{\gamma}}\, du = V_{\alpha,\beta}(X_{U_n^*}; F(t)) + \left(\frac{1}{\beta-\alpha}\right) \ln\left[E\left(f^{\gamma-1}\left(F^{-1}\left(1-e^{-V_n}\right)\right)\right)\right], \tag{9}$$
since, by Lemma 1, $V_{\alpha,\beta}(X_{U_n^*}; F(t)) = \left(\frac{1}{\beta-\alpha}\right) \ln\left[\Gamma(\gamma(n-1)+1;\, -\ln \bar{F}(t)) / \Gamma^{\gamma}(n;\, -\ln \bar{F}(t))\right]$. This completes the proof.

As a consequence of Theorem 2, we get the followingremark.

Remark 3. Let $X_{U_n^E}$ denote the $n$th upper record value from a sequence of independent observations from the standard exponential distribution. Then,
$$V_{\alpha,\beta}(X_{U_n^E}; t) = \left(\frac{1}{\beta-\alpha}\right) \ln \frac{\Gamma(\gamma(n-1)+1;\, t)}{\Gamma^{\gamma}(n;\, t)} + \left(\frac{1}{\beta-\alpha}\right) \ln E\left(e^{-(\gamma-1)Z}\right), \tag{10}$$
where $Z \sim \Gamma_t(\gamma(n-1)+1,\, 1)$.

In Table 1, we present expressions of $V_{\alpha,\beta}(X_{U_n}; t)$ for the Weibull and Pareto distributions. It is easy to show that $\lambda_{F_{U_n}}(t)/\lambda_F(t)$ and $f_{U_n}(t)/f(t)$ are increasing functions of $t \ge 0$ (see Li and Zhang [18]). Therefore, we have the following theorem, whose proof follows along lines similar to those of Theorem 8 of Kayal [11].


Table 1: Expressions of $V_{\alpha,\beta}(X_{U_n}; t)$.

Weibull, $f(x \mid \theta) = \theta x^{\theta-1} e^{-x^{\theta}}$, $x > 0$, $\theta > 0$:
$$\left(\frac{1}{\beta-\alpha}\right) \ln \frac{\Gamma\left(n\gamma - \frac{\gamma-1}{\theta};\, \gamma t^{\theta}\right)}{\Gamma^{\gamma}\left(n;\, t^{\theta}\right)} + \left(\frac{\gamma-1}{\beta-\alpha}\right) \ln\theta - \left(\frac{n\gamma - (1/\theta)(\gamma-1)}{\beta-\alpha}\right) \ln\gamma.$$

Pareto, $f(x \mid \theta, \delta) = \theta\delta^{\theta}/x^{\theta+1}$, $x \ge \delta > 0$, $\theta > 0$:
$$\left(\frac{1}{\beta-\alpha}\right) \ln\left(\frac{(\theta/\delta)^{\gamma-1}\,\Gamma\bigl(\gamma(n-1)+1;\, -\bigl((1+1/\theta)(\gamma-1)+1\bigr)\theta\ln(\delta/t)\bigr)}{\bigl((1+1/\theta)(\gamma-1)+1\bigr)^{\gamma(n-1)+1}\,\Gamma^{\gamma}\bigl(n;\, -\theta\ln(\delta/t)\bigr)}\right).$$

Table 2: Bounds of $V_{\alpha,\beta}(X_{U_n}; t)$.

Pareto, $f(x \mid \theta, \delta) = \theta\delta^{\theta}/x^{\theta+1}$, $x \ge \delta > 0$, $\theta > 0$:
$$V_{\alpha,\beta}(X_{U_n}; t) \le B_r(t) + \left(\frac{1}{\beta-\alpha}\right) \ln\left[\frac{\theta^{\gamma}\,\delta^{\theta(\gamma-1)}}{(\theta+1)(\gamma-1)\, t^{(\theta+1)(\gamma-1)}}\right],$$
$$V_{\alpha,\beta}(X_{U_n}; t) \le (\ge)\; V_{\alpha,\beta}(X_{U_n^*}; t) + \left(\frac{\gamma-1}{\beta-\alpha}\right) \ln\left(\frac{\theta}{\delta}\right), \quad \text{for } \alpha+\beta > (<)\,2.$$

Exponential, $f(x \mid \theta) = (1/\theta)\exp(-x/\theta)$, $x > 0$, $\theta > 0$:
$$V_{\alpha,\beta}(X_{U_n}; t) \le B_r(t) + \left(\frac{1}{\beta-\alpha}\right) \ln\left(\frac{\theta^{1-\gamma}}{\gamma-1}\exp\left(\frac{-t(\gamma-1)}{\theta}\right)\right),$$
$$V_{\alpha,\beta}(X_{U_n}; t) \le (\ge)\; V_{\alpha,\beta}(X_{U_n^*}; t) - \left(\frac{\gamma-1}{\beta-\alpha}\right) \ln\theta, \quad \text{for } \alpha+\beta > (<)\,2.$$

Theorem 4. The $n$th upper record value $X_{U_n}$ has increasing (decreasing) generalized residual entropy (IGRE (DGRE)) if $X$ is IGRE (DGRE).

Note that, for the $n$th and $(n+1)$th upper record values, $\lambda_{F_{U_{n+1}}}(t)/\lambda_{F_{U_n}}(t)$ is increasing in $t \ge 0$ (see Kochar [19]), and hence $X_{U_n} \le_{lr} X_{U_{n+1}}$. Therefore, the following corollary immediately follows from Theorem 4.

Corollary 5. The $(n+1)$th upper record value $X_{U_{n+1}}$ is IGRE (DGRE) if the $n$th upper record value $X_{U_n}$ is IGRE (DGRE).

The following theorem provides bounds for the generalized residual entropy of the $n$th upper record value $X_{U_n}$. We omit the proof, as it follows an approach analogous to that of Theorem 11 of Kayal [11].

Theorem 6. (a) Let $V_{\alpha,\beta}(X_{U_n}; t)$ be finite. If $m_r = \max\{\gamma(n-1),\, -\ln \bar{F}(t)\}$ is the mode of $\Gamma_{-\ln \bar{F}(t)}(\gamma(n-1)+1,\, 1)$, then
$$V_{\alpha,\beta}(X_{U_n}; t) \le B_r(t) + \left(\frac{1}{\beta-\alpha}\right) \ln \int_{t}^{\infty} \lambda_F(x)\, f^{\gamma-1}(x)\,dx, \tag{11}$$
where $B_r(t) = -\left(\gamma/(\beta-\alpha)\right) \ln \Gamma(n;\, -\ln \bar{F}(t)) + \left(\gamma(n-1)/(\beta-\alpha)\right) \ln m_r - \left(1/(\beta-\alpha)\right) m_r$.

(b) Let $M = f(m) < \infty$, where $m$ is the mode of the distribution with density function $f(x)$. Also, let $X_{U_n^*}$ denote the $n$th upper record value from a sequence of observations from $U(0,1)$. Then, for $\alpha+\beta > (<)\,2$,
$$V_{\alpha,\beta}(X_{U_n}; t) \le (\ge)\; V_{\alpha,\beta}(X_{U_n^*}; t) + \left(\frac{\gamma-1}{\beta-\alpha}\right) \ln M. \tag{12}$$

As an application of Theorem 6, we obtain the bounds for the Pareto and exponential distributions presented in Table 2. The following theorem gives the monotone behaviour of the generalized residual entropy of upper record values in terms of $n$.

Definition 7. Let $X$ and $Y$ be two nonnegative random variables with survival functions $\bar{F}(x)$ and $\bar{G}(x)$, respectively. Then $X$ is said to be smaller than $Y$ in the usual stochastic ordering, denoted by $X \le_{st} Y$, if $\bar{F}(x) \le \bar{G}(x)$ for all $x \ge 0$.

Theorem 8. Let $X_{U_1}, X_{U_2}, \ldots$ be a sequence of upper record values from a distribution with cumulative distribution function $F(x)$ and probability density function $f(x)$, and let $f(x)$ be an increasing function. Then $V_{\alpha,\beta}(X_{U_n}; t)$ is increasing (decreasing) in $n$ for $\alpha+\beta > (<)\,2$.

Proof. From (8), we have
$$V_{\alpha,\beta}(X_{U_{n+1}}; t) - V_{\alpha,\beta}(X_{U_n}; t) = V_{\alpha,\beta}(X_{U_{n+1}^*}; F(t)) - V_{\alpha,\beta}(X_{U_n^*}; F(t)) + \left(\frac{1}{\beta-\alpha}\right) \ln \frac{E\left(f^{\gamma-1}\left(F^{-1}\left(1-e^{-V_{n+1}}\right)\right)\right)}{E\left(f^{\gamma-1}\left(F^{-1}\left(1-e^{-V_n}\right)\right)\right)}. \tag{13}$$
Moreover, for $\alpha+\beta > (<)\,2$,
$$\frac{\int_{-\ln \bar{F}(t)}^{\infty} e^{-x} x^{\gamma(n-1)}\,dx}{\int_{-\ln \bar{F}(t)}^{\infty} e^{-x} x^{n-1}\,dx} \tag{14}$$
is an increasing (decreasing) function of $t$. Therefore, for $u(x) = x$ and $v_{\gamma}(x) = e^{-x} x^{-\gamma}$, we have $W_{\gamma} \ge_{st} (\le_{st})\; W_1$ for $\alpha+\beta > (<)\,2$. Hence, from Theorem 12 of Kayal [11], it can be proved that $V_{\alpha,\beta}(X_{U_n^*}; F(t))$ is increasing (decreasing) in $n$ for $\alpha+\beta > (<)\,2$. Now, the proof follows along the lines of the proof of Theorem 3.7 of Zarezadeh and Asadi [20]. This completes the proof of the theorem.


3. Weighted Distributions

To overcome the difficulty of modeling nonrandomized data sets in environmental and ecological studies, Rao [21] introduced the concept of weighted distributions. Let the probability density function of $X$ be $f(x)$, and let $w(x)$ be a nonnegative function with $\mu_w = E(w(X)) < \infty$. Also, let $f_w(x)$ and $\bar{F}_w(x)$, respectively, be the probability density function and survival function of the weighted random variable $X_w$, which are given by
$$f_w(x) = \frac{w(x)\, f(x)}{\mu_w}, \tag{15}$$
$$\bar{F}_w(x) = \frac{E(w(X) \mid X \ge x)\, \bar{F}(x)}{\mu_w}. \tag{16}$$
The hazard rate of $X_w$ is
$$\lambda_{F_w}(x) = \frac{w(x)}{E(w(X) \mid X \ge x)}\, \lambda_F(x). \tag{17}$$
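As a concrete illustration of (15)-(17) (our own example, not from the paper): take $X$ exponential with mean 1 and $w(x) = x$. Then $\mu_w = 1$, $f_w(x) = x e^{-x}$, and, by the memoryless property, $E(w(X) \mid X \ge x) = x + 1$, so (17) predicts $\lambda_{F_w}(x) = \frac{x}{x+1}\,\lambda_F(x)$ with $\lambda_F(x) = 1$:

```python
import math

# X ~ exponential(1), w(x) = x: f_w(x) = x e^{-x} (Eq. (15)) and the
# survival function of X_w is (x + 1) e^{-x} (Eq. (16), E(X | X >= x) = x + 1).
for x in (0.5, 1.0, 2.0):
    f_w = x * math.exp(-x)
    sf_w = (x + 1.0) * math.exp(-x)
    hazard_direct = f_w / sf_w               # hazard rate from its definition
    hazard_eq17 = (x / (x + 1.0)) * 1.0      # right-hand side of Eq. (17)
    assert abs(hazard_direct - hazard_eq17) < 1e-12
print("Eq. (17) verified at the sampled points")
```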

For some results and applications of weighted distributions, one may refer to Di Crescenzo and Longobardi [22], Gupta and Kirmani [23], Kayal [9], Maya and Sunoj [24], Navarro et al. [25], and Patil [26]. In the present section, we obtain some comparison results based on the generalized residual entropy between a random variable and its weighted version. We need the following definition in this direction.

Definition 9. A random variable $X$ with hazard rate $\lambda_F(x)$ is said to have a decreasing (increasing) failure rate (DFR (IFR)) if $\lambda_F(t)$ is decreasing (increasing) in $t \ge 0$.

Theorem 10. Let $X$ and $Y$ be two random variables with cumulative distribution functions $F(x)$ and $G(x)$, probability density functions $f(x)$ and $g(x)$, survival functions $\bar{F}(x)$ and $\bar{G}(x)$, and hazard rates $\lambda_F(x)$ and $\lambda_G(x)$, respectively. If $\lambda_F(t) \le \lambda_G(t)$ for all $t \ge 0$, and either $F(x)$ or $G(x)$ is DFR, then $V_{\alpha,\beta}(X; t) \le (\ge)\; V_{\alpha,\beta}(Y; t)$ for $\alpha+\beta > (<)\,2$.

Proof. The proof follows along the lines of that of Theorem 4 of Asadi et al. [7].

Theorem 11. Under the assumptions of Theorem 10, if $\lambda_F(t) \ge \lambda_G(t)$ for all $t \ge 0$, and either $F(x)$ or $G(x)$ is DFR, then $V_{\alpha,\beta}(X; t) \ge (\le)\; V_{\alpha,\beta}(Y; t)$ for $\alpha+\beta > (<)\,2$.

Proof. The proof follows from that of Theorem 4 of Asadi et al. [7].

Theorem 12. (a) Suppose $E(w(X) \mid X \ge t)$, or $w(t)$, is decreasing. If $X$ or $X_w$ is DFR, then, for all $t \ge 0$, $V_{\alpha,\beta}(X; t) \le (\ge)\; V_{\alpha,\beta}(X_w; t)$ for $\alpha+\beta > (<)\,2$.

(b) Suppose $E(w(X) \mid X \ge t)$, or $w(t)$, is increasing. If $X$ or $X_w$ is DFR, then, for all $t \ge 0$, $V_{\alpha,\beta}(X; t) \ge (\le)\; V_{\alpha,\beta}(X_w; t)$ for $\alpha+\beta > (<)\,2$.

Proof. It is not difficult to see that $\lambda_F(t) \le \lambda_{F_w}(t)$ for all $t \ge 0$ when either $E(w(X) \mid X \ge t)$ or $w(t)$ is decreasing. The proof of part (a) then follows from Theorem 10. Part (b) can be proved similarly. This completes the proof of the theorem.

Let $X$ be a random variable with density function $f(x)$ and cumulative distribution function $F(x)$. Also, let $\mu_x = E(X) > 0$ be finite. Denote the length-biased version of $X$ by $X_L$. Then, the probability density function of $X_L$ is given by
$$f_L(x) = \frac{x\, f(x)}{\mu_x}. \tag{18}$$

The random variable $X_L$ arises in the study of lifetime analysis and various probability proportional-to-size sampling properties. Associated with a random variable $X$, one can define another random variable $X_E$ with density function
$$f_E(x) = \frac{\bar{F}(x)}{\mu_x}. \tag{19}$$
This distribution is known as the equilibrium distribution of $X$. The random variables $X_L$ and $X_E$ are weighted versions of $X$ with weight functions $w_L(x) = x$ and $w_E(x) = 1/\lambda_F(x)$, respectively. The following corollary is a consequence of Theorem 12.

Corollary 13. Let $X$ be DFR. Then, for all $t \ge 0$,

(a) $V_{\alpha,\beta}(X; t) \ge (\le)\; V_{\alpha,\beta}(X_L; t)$ for $\alpha+\beta > (<)\,2$;

(b) $V_{\alpha,\beta}(X; t) \ge (\le)\; V_{\alpha,\beta}(X_E; t)$ for $\alpha+\beta > (<)\,2$.
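Corollary 13(a) can be checked numerically for a DFR example (our own sketch; the function name and quadrature scheme are arbitrary choices). For $X$ exponential with mean 1, the length-biased version $X_L$ has the Gamma(2,1) density $x e^{-x}$ and survival function $(1+x)e^{-x}$:

```python
import math

def varma_residual(pdf, sf, t, alpha, beta, upper=50.0, n=200_000):
    """V_{alpha,beta}(X; t) of Eq. (2) by the trapezoidal rule on [t, upper]."""
    g = alpha + beta - 1.0
    h = (upper - t) / n
    ys = [pdf(t + i * h) ** g for i in range(n + 1)]
    integral = h * (sum(ys) - 0.5 * (ys[0] + ys[-1]))
    return (math.log(integral) - g * math.log(sf(t))) / (beta - alpha)

alpha, beta, t = 1.2, 1.5, 0.5           # alpha + beta = 2.7 > 2
v_x = varma_residual(lambda x: math.exp(-x), lambda x: math.exp(-x), t, alpha, beta)
v_xl = varma_residual(lambda x: x * math.exp(-x), lambda x: (1.0 + x) * math.exp(-x), t, alpha, beta)
print(v_x, v_xl)   # v_x ≈ -1.769 exceeds v_xl ≈ -3.08, as Corollary 13(a) asserts
```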

4. Estimation

In this section, we discuss the problem of estimating the generalized residual entropy of a statistical distribution based on upper record values. Here, we consider the exponential distribution, which has various applications in practice. Let $X$ follow an exponential distribution with mean $\lambda$. Then, from (2), we obtain
$$V_{\alpha,\beta}(X; t) = \left(\frac{\alpha+\beta-1}{\beta-\alpha}\right) \ln \lambda - \left(\frac{1}{\beta-\alpha}\right) \ln(\alpha+\beta-1). \tag{20}$$

Based on the $n$ upper record values, the maximum likelihood estimator (mle) of $\lambda$ can be obtained as $\hat{\lambda} = X_{U_n}/n$, where $X_{U_n}$ is the $n$th upper record value. Now, applying the invariance property, we obtain the mle of $V_{\alpha,\beta}(X; t)$ as
$$\hat{V}^{\mathrm{ml}}_{\alpha,\beta} = \left(\frac{\alpha+\beta-1}{\beta-\alpha}\right) \ln\left(\frac{X_{U_n}}{n}\right) - \left(\frac{1}{\beta-\alpha}\right) \ln(\alpha+\beta-1). \tag{21}$$

Also, the uniformly minimum variance unbiased estimator (umvue) of $V_{\alpha,\beta}(X; t)$ can be obtained as
$$\hat{V}^{\mathrm{mv}}_{\alpha,\beta} = \left(\frac{\alpha+\beta-1}{\beta-\alpha}\right) \ln\left(\frac{X_{U_n}}{\exp(\psi(n))}\right) - \left(\frac{1}{\beta-\alpha}\right) \ln(\alpha+\beta-1), \tag{22}$$
where $\psi(n)$ is the digamma function. To illustrate the estimation techniques developed in this section, we consider simulated data from an exponential distribution, generated by Monte Carlo simulation.

Example 14. In this example, we consider a simulated sample of $n = 5$ upper records from the exponential distribution with mean 0.5. The simulated upper records are as follows:
$$0.265410,\ 0.637725,\ 0.688878,\ 0.791721,\ 2.114831. \tag{23}$$
Based on these upper record values, we have $\hat{V}^{\mathrm{ml}}_{\alpha,\beta} = -6.64759$ and $\hat{V}^{\mathrm{mv}}_{\alpha,\beta} = -6.06166$ when $\alpha = 1.2$ and $\beta = 1.5$.
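The two estimates in Example 14 can be reproduced directly from (21) and (22) (our own sketch; the digamma value is computed via the harmonic-number identity $\psi(n) = -\gamma_E + \sum_{k=1}^{n-1} 1/k$ for integer $n$, $\gamma_E$ being Euler's constant). The recomputed values agree with the printed ones to about two decimal places; the small residual differences presumably reflect rounding at some stage of the original computation:

```python
import math

records = [0.265410, 0.637725, 0.688878, 0.791721, 2.114831]
alpha, beta = 1.2, 1.5
n = len(records)
x_un = records[-1]                 # the n-th (largest) upper record value

euler_gamma = 0.5772156649015329
psi_n = -euler_gamma + sum(1.0 / k for k in range(1, n))   # psi(5)

c = math.log(alpha + beta - 1.0) / (beta - alpha)
coef = (alpha + beta - 1.0) / (beta - alpha)
v_ml = coef * math.log(x_un / n) - c                       # Eq. (21)
v_mv = coef * math.log(x_un / math.exp(psi_n)) - c         # Eq. (22)
print(v_ml, v_mv)   # ≈ -6.64 and ≈ -6.06 (Example 14 reports -6.64759 and -6.06166)
```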

5. Concluding Remarks

In this paper, we consider the generalized residual entropy due to Varma [5] for record values and weighted distributions. We obtain some results on the monotone behaviour of this measure in upper record values. Some bounds are obtained. Further, some comparison results between a random variable and its weighted version based on the generalized residual entropy are studied. Finally, two estimators of the generalized residual entropy of the exponential distribution have been described.

Conflict of Interests

The author declares that there is no conflict of interestsregarding the publication of this paper.

References

[1] C. E. Shannon, "A mathematical theory of communication," The Bell System Technical Journal, vol. 27, no. 3, pp. 379–423, 1948.

[2] J. N. Kapur, "Generalized entropy of order alpha and type beta," in Proceedings of the Mathematical Seminar, pp. 78–94, New Delhi, India, 1967.

[3] A. Renyi, "On measures of entropy and information," in Proceedings of the 4th Berkeley Symposium on Mathematics, Statistics and Probability 1960, vol. 1, pp. 547–561, University of California Press, Berkeley, Calif, USA, 1961.

[4] C. Tsallis, "Possible generalization of Boltzmann-Gibbs statistics," Journal of Statistical Physics, vol. 52, no. 1-2, pp. 479–487, 1988.

[5] R. S. Varma, "Generalization of Renyi's entropy of order α," Journal of Mathematical Sciences, vol. 1, pp. 34–48, 1966.

[6] N. Ebrahimi, "How to measure uncertainty in the residual life time distribution," Sankhya, vol. 58, no. 1, pp. 48–57, 1996.

[7] M. Asadi, N. Ebrahimi, and E. S. Soofi, "Dynamic generalized information measures," Statistics & Probability Letters, vol. 71, no. 1, pp. 85–98, 2005.

[8] S. Kayal, "On generalized dynamic survival and failure entropies of order (α, β)," Statistics & Probability Letters, vol. 96, pp. 123–132, 2015.

[9] S. Kayal, "Some results on dynamic discrimination measures of order (α, β)," Hacettepe Journal of Mathematics and Statistics, vol. 44, pp. 179–188, 2015.

[10] S. Kayal, "Characterization based on generalized entropy of order statistics," Communications in Statistics—Theory and Methods. In press.

[11] S. Kayal, "Some results on a generalized residual entropy based on order statistics," Statistica. In press.

[12] S. Kayal and P. Vellaisamy, "Generalized entropy properties of records," Journal of Analysis, vol. 19, pp. 25–40, 2011.

[13] V. Kumar and H. C. Taneja, "Some characterization results on generalized cumulative residual entropy measure," Statistics & Probability Letters, vol. 81, no. 8, pp. 1072–1077, 2011.

[14] M. M. Sati and N. Gupta, "On partial monotonic behaviour of Varma entropy and its application in coding theory," Journal of the Indian Statistical Association. In press.

[15] N. Glick, "Breaking records and breaking boards," The American Mathematical Monthly, vol. 85, no. 1, pp. 2–26, 1978.

[16] J. Ahmadi and N. R. Arghami, "Comparing the Fisher information in record values and iid observations," Statistics, vol. 37, no. 5, pp. 435–441, 2003.

[17] B. C. Arnold, N. Balakrishnan, and H. N. Nagaraja, Records, John Wiley & Sons, New York, NY, USA, 1998.

[18] X. Li and S. Zhang, "Some new results on Renyi entropy of residual life and inactivity time," Probability in the Engineering and Informational Sciences, vol. 25, no. 2, pp. 237–250, 2011.

[19] S. C. Kochar, "Some partial ordering results on record values," Communications in Statistics—Theory and Methods, vol. 19, no. 1, pp. 299–306, 1990.

[20] S. Zarezadeh and M. Asadi, "Results on residual Renyi entropy of order statistics and record values," Information Sciences, vol. 180, no. 21, pp. 4195–4206, 2010.

[21] C. R. Rao, "On discrete distributions arising out of methods of ascertainment," Sankhya, vol. 27, pp. 311–324, 1965.

[22] A. Di Crescenzo and M. Longobardi, "On weighted residual and past entropies," Scientiae Mathematicae Japonicae, vol. 64, no. 2, pp. 255–266, 2006.

[23] R. C. Gupta and S. N. U. A. Kirmani, "The role of weighted distributions in stochastic modeling," Communications in Statistics—Theory and Methods, vol. 19, no. 9, pp. 3147–3162, 1990.

[24] S. S. Maya and S. M. Sunoj, "Some dynamic generalized information measures in the context of weighted models," Statistica, vol. 68, no. 1, pp. 71–84, 2008.

[25] J. Navarro, Y. del Aguila, and J. M. Ruiz, "Characterizations through reliability measures from weighted distributions," Statistical Papers, vol. 42, no. 3, pp. 395–402, 2001.

[26] G. P. Patil, "Weighted distributions," in Encyclopedia of Environmetrics, A. H. El-Shaarawi and W. W. Piegorsch, Eds., John Wiley & Sons, Chichester, UK, 2002.
