

Uncertainty, information and time-frequency distributions *

William J. Williams, Mark L. Brown, and Alfred O. Hero III

Department of Electrical Engineering and Computer Science

The University of Michigan, Ann Arbor, MI 48109, (313) 764-8516

ABSTRACT

The well-known uncertainty principle is often invoked in signal processing. It is also often considered to have the same implications in signal analysis as does the uncertainty principle in quantum mechanics. The uncertainty principle is often incorrectly interpreted to mean that one cannot locate the time-frequency coordinates of a signal with arbitrarily good precision, since, in quantum mechanics, one cannot determine the position and momentum of a particle with arbitrarily good precision. Rényi information of the third order is used to provide an information measure on time-frequency distributions. The results suggest that even though this new measure tracks time-bandwidth results for two Gabor logons separated in time and/or frequency, the information measure is more general and provides a quantitative assessment of the number of resolvable components in a time-frequency representation. As such, the information measure may be useful as a tool in the design and evaluation of time-frequency distributions.

1. TIME-FREQUENCY UNCERTAINTY

Time-frequency uncertainty is defined in various ways with slightly different results. Gabor's10 definition will be used for the purposes of this paper. It is:

\Delta t \, \Delta \omega \ge \frac{1}{2} \qquad (1.1)

*This research was supported in part by grants from the Rackham School of Graduate Studies and the Office of Naval Research, ONR contracts no. N00014-89-J-1723 and N00014-90-J-1654.

The proper interpretation3 of \Delta t and \Delta \omega is that they are the standard deviations (respectively) of the time and frequency marginals of the time-frequency distribution. This assumes that the time and frequency marginals are 'correct', in that they reflect the instantaneous power and the energy spectrum of the signal. If a signal is stretched in time, its Fourier transform representation is compressed inversely in frequency. Thus the \Delta t \, \Delta \omega product remains constant. It is correct that one cannot independently reduce the time spread and frequency spread simultaneously. These quantities are tightly linked via the scaling property of the Fourier transform. That is,

\mathcal{F}\{s(at)\} = \frac{1}{|a|}\, S\!\left(\frac{\omega}{a}\right) \qquad (1.2)

1.1 Representation, localization and resolution
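The scaling property above can be checked numerically. The sketch below is an illustration (not from the paper); the grid, FFT conventions, and tolerances are my assumptions. It estimates the standard deviations of the time and frequency marginals of a Gabor logon and shows that their product stays at 1/2 under time scaling, even as each spread changes.

```python
import numpy as np

def marginal_spreads(s, t):
    """Standard deviations of the time and frequency marginals of s(t)."""
    dt = t[1] - t[0]
    p = np.abs(s) ** 2
    p /= p.sum() * dt                      # normalize time marginal
    mt = (t * p).sum() * dt
    sigma_t = np.sqrt(((t - mt) ** 2 * p).sum() * dt)

    S = np.fft.fftshift(np.fft.fft(s)) * dt / np.sqrt(2 * np.pi)   # unitary FT
    w = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(t.size, d=dt))
    dw = w[1] - w[0]
    P = np.abs(S) ** 2
    P /= P.sum() * dw                      # normalize frequency marginal
    mw = (w * P).sum() * dw
    sigma_w = np.sqrt(((w - mw) ** 2 * P).sum() * dw)
    return sigma_t, sigma_w

t = np.linspace(-20, 20, 4096)
for a in (1.0, 2.0, 0.5):                  # time-scaled logons s(at)
    s = np.sqrt(a) * np.pi ** -0.25 * np.exp(-(a * t) ** 2 / 2)
    st, sw = marginal_spreads(s, t)
    print(f"a = {a}: sigma_t * sigma_w = {st * sw:.4f}")   # ~0.5 each time
```

Stretching (a < 1) widens the time marginal and narrows the frequency marginal by the same factor, so the product is pinned at the lower bound of (1.1).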

It is certainly true that the concept of time-bandwidth product (TBP) is useful in characterizing the concentration of energy in time and frequency. It may seem that the most concentrated signal offers the best localization in time-frequency. This is not true. Skolnik14 points out that there are many signals which have large

144 / SPIE Vol. 1566 Advanced Signal Processing Algorithms, Architectures, and Implementations II (1991) 0-8194-0694-5/91/$4.00


TBPs, yet offer excellent localization properties. The problem is that ambiguities arise.

One may use a pair of chirps in time-frequency. One chirp may rise linearly in frequency (up-chirp) while the other falls linearly in frequency (down-chirp). Each chirp will produce a thin "knife edge" in time-frequency. The time-frequency distribution will be in an "X" form due to the crossing of the up-chirp and down-chirp. The crossing point defines both the frequency and time location of the "center" of the compound signal, and it defines these locations quite precisely. This definition may be made ever more precise by increasing the time and frequency extent of the chirps. Thus we have the supposedly paradoxical result that increasing the TBP improves time and frequency localization. There is a price to pay, however. With multiple versions of these "X"s there may be overlap of the chirps from the different versions. The problem is to distinguish the "true" crossings which yield the time-frequency localizations of each "X" from the "false" crossings due to overlap of the "X"s. Ambiguities have been introduced.

Resolution of signals is another important aspect of signal processing. It is here that TBP and uncertainty in the signal processing context make sense. The concept of an elementary signal is needed. There are several possibilities; prolate spheroidal functions,15,16 Hermite functions,6 and special compact bases11 are among them. Gabor logons will be used in this paper due to their minimum uncertainty properties. If there are two elementary signals, how much separation between these two elementary signals must one achieve in order to be able to conclude that there are two signals present rather than one? One would like an objective count of the number of resolvable elementary signals present.

An information theoretic approach has some appeal in this context. Such an approach would produce a result expressed in "bits". One elementary signal would yield zero bits of information (2^0), two well separated elementary signals would yield one bit of information (2^1), four well separated elementary signals would yield two bits of information (2^2), and so on. The strategy for this paper is to apply information measures to time-frequency distributions. Unfortunately, the well known Shannon information13 cannot be applied to some time-frequency distributions due to their negative energy values. There is a generalized form of information due to Rényi12 which admits negative values in the distribution. This allows a treatment of many time-frequency distributions in terms of information measures, even though Rényi formally requires that the probabilities be non-negative in his formulation. As will be seen later in this paper, violation of that rule does not prevent "reasonable" answers from being obtained.
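The point about negative values can be made concrete with a small sketch (my illustration, with an arbitrary test density): Shannon's integrand requires the logarithm of the distribution and breaks down where it goes negative, while the third-order Rényi measure only needs the integral of the cube, which can remain positive.

```python
import numpy as np

x = np.linspace(-10, 10, 2001)
dx = x[1] - x[0]

p = np.exp(-x ** 2) / np.sqrt(np.pi)       # ordinary Gaussian density
q = p - 0.05 * np.exp(-(x - 3) ** 2)       # ripple drives q negative near x = 3
q /= q.sum() * dx                          # renormalize to unit area

# Shannon's integrand is undefined where q < 0: log2 of a negative number.
with np.errstate(invalid="ignore"):
    shannon_integrand = q * np.log2(q)     # contains NaNs in the negative region

# Third-order Rényi information only needs the integral of q^3, which
# stays positive here even though q itself does not.
r3 = -0.5 * np.log2((q ** 3).sum() * dx)
print(f"R3 = {r3:.3f} bits")
```

This is exactly the situation with the Wigner distribution, whose cross terms swing negative.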

1.2 Time-frequency distributions

There are many types of time-frequency distributions. For purposes of this paper the distributions utilized will be members of a general class of distributions commonly called Cohen's class.3 Cohen's class is defined as3

C_s(t,\omega;\phi) = \frac{1}{4\pi^2} \iiint e^{-j\theta t - j\tau\omega + j\theta u}\, \phi(\theta,\tau)\, s\!\left(u+\frac{\tau}{2}\right) s^*\!\left(u-\frac{\tau}{2}\right) du\, d\tau\, d\theta, \qquad (1.3)

and has been frequently used as a unified framework in time-frequency signal analysis. The kernel \phi(\theta,\tau) determines the distribution characteristics. Different kernels produce different distributions,3 such as the WD, the spectrogram, the Exponential Distribution (ED),2 and the RID.9,17,18,19 The kernel has a unity value for the WD. The RID (Reduced Interference Distribution) enjoys almost all of the nice properties of the WD, but has the advantage of greatly reducing the cross-term interference that often plagues the WD
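The role of the kernel can be sketched in a few lines of discrete code. This is a simplified illustration, not the paper's implementation: the circular indexing, the 2m lag grid, and the exponential kernel's spread parameter are my choices. The instantaneous autocorrelation is taken to the ambiguity plane, weighted by the kernel, and transformed back; the unit kernel recovers a discrete WD, while an exponential-type kernel (in the spirit of the ED) attenuates ambiguity-plane components away from the axes, where cross terms live.

```python
import numpy as np

def cohen_class(s, kernel):
    """Discrete sketch of a Cohen's-class distribution C(t, omega)."""
    N = len(s)
    n = np.arange(N)
    K = np.empty((N, N), dtype=complex)
    for m in range(-(N // 2), N // 2):
        # instantaneous autocorrelation K[n, m] = s[n+m] s*[n-m] (lag 2m)
        K[:, m % N] = s[(n + m) % N] * np.conj(s[(n - m) % N])
    A = np.fft.fft(K, axis=0)                    # (theta, tau) ambiguity plane
    theta = 2 * np.pi * np.fft.fftfreq(N)
    tau = 2 * N * np.fft.fftfreq(N)              # lag 2m, in samples
    A *= kernel(theta[:, None], tau[None, :])    # shape the distribution
    return np.fft.fft(np.fft.ifft(A, axis=0), axis=1).real

unit_kernel = lambda th, ta: np.ones(np.broadcast(th, ta).shape)   # -> WD
exp_kernel = lambda th, ta: np.exp(-(th * ta) ** 2 / 50.0)         # ED-style

N = 128
t = np.arange(N)
s = np.exp(2j * np.pi * 16 / 128 * t) + np.exp(2j * np.pi * 48 / 128 * t)
wd = cohen_class(s, unit_kernel)
ed = cohen_class(s, exp_kernel)
# For these two tones, the cross term lands midway between them (bin 64)
# and is much weaker under the exponential-type kernel than under the WD.
```

With the unit kernel, summing each time slice over frequency recovers N times the instantaneous power, which is the marginal property the text relies on.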


when it is used to study multicomponent signals.

1.3 The Gabor logon

The Gabor logon10 is a popular candidate for representation of the elementary signal element. It is possible to expand signals in terms of such elementary signals and such an approach has been useful in the detection of transient signals.7 Using a Gabor logon defined as:

s(t) = \frac{1}{(\pi\sigma^2)^{1/4}}\, e^{-t^2/2\sigma^2} \qquad (1.4)

The instantaneous power (the time marginal) is:

p(t) = s^2(t) = \frac{1}{\sqrt{\pi\sigma^2}}\, e^{-t^2/\sigma^2} \qquad (1.5)

Power is normalized to 1:

\int_{-\infty}^{\infty} p(t)\, dt = \frac{1}{\sqrt{\pi\sigma^2}} \int_{-\infty}^{\infty} e^{-t^2/\sigma^2}\, dt = 1 \qquad (1.6)

The Fourier transform of s(t) is:

S(\omega) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} s(t)\, e^{-j\omega t}\, dt = \left(\frac{\sigma^2}{\pi}\right)^{1/4} e^{-\sigma^2\omega^2/2} \qquad (1.7)

The energy density in the frequency domain (the frequency marginal) is:

P(\omega) = S^2(\omega) = \sqrt{\frac{\sigma^2}{\pi}}\, e^{-\sigma^2\omega^2} \qquad (1.8)

Total energy in the frequency domain is also normalized:

\int_{-\infty}^{\infty} P(\omega)\, d\omega = \sqrt{\frac{\sigma^2}{\pi}} \int_{-\infty}^{\infty} e^{-\sigma^2\omega^2}\, d\omega = 1 \qquad (1.9)

Equation 1.1 takes the equality value in this case, as can readily be seen by determining the product of the standard deviations of the time and frequency marginals. The time and frequency marginals are shown in Figure 1a,b.

1.4 Wigner distribution for one Gabor logon

The Wigner distribution (WD) for the Gabor logon with a \sigma^2 of one is, using the standard definition3:

W(t,\omega) = \frac{1}{2\pi} \int_{-\infty}^{\infty} s(t+\tau/2)\, s^*(t-\tau/2)\, e^{-j\omega\tau}\, d\tau = \frac{1}{\pi}\, e^{-\omega^2} e^{-t^2} \qquad (1.10)

The Wigner distribution is also normalized; the energy obtained by integrating over all time and frequency is 1:

\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} W(t,\omega)\, d\omega\, dt = \frac{1}{\pi} \int_{-\infty}^{\infty} e^{-\omega^2}\, d\omega \int_{-\infty}^{\infty} e^{-t^2}\, dt = 1 \qquad (1.11)

This would be true for other values of \sigma^2 as well. The WD of the Gabor logon is shown in Figure 1c. Note that the distributions must always be normalized for the purposes of this study. We wish to treat the distributions using tools usually applied to probability density functions, hence they must be properly normalized.

2. INFORMATION MEASURES

Information is most often related to random events. Distributions considered in the information context are usually distributions of random variables. In this paper we wish to appropriate information concepts and apply them to distributions which may be derived from deterministic signals. This is done in order to benefit from certain very desirable properties of information theoretic measures. There are six properties of information that are desirable. For the purposes of this paper the properties of
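The closed forms (1.5), (1.10), and (1.11) can be sanity-checked on a grid. This sketch is my illustration (grid extents and tolerances are arbitrary choices): it confirms that the Gaussian WD integrates to one and that integrating out frequency recovers the time marginal.

```python
import numpy as np

t = np.linspace(-6, 6, 601)
w = np.linspace(-6, 6, 601)
dt, dw = t[1] - t[0], w[1] - w[0]
T, Om = np.meshgrid(t, w, indexing="ij")

Wd = np.exp(-T ** 2) * np.exp(-Om ** 2) / np.pi   # eq. (1.10), sigma^2 = 1

total = Wd.sum() * dt * dw                        # eq. (1.11): integrates to 1
time_marginal = Wd.sum(axis=1) * dw               # integrate out frequency
p = np.exp(-t ** 2) / np.sqrt(np.pi)              # eq. (1.5) with sigma = 1

print(round(total, 6))                            # -> 1.0
print(np.abs(time_marginal - p).max() < 1e-9)     # -> True
```

The same check with the frequency marginal (sum over the time axis against eq. (1.8)) works identically by symmetry.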


additivity, continuity and symmetry are particularly useful. A brief presentation of Shannon and Rényi's information formulations and their relationship sets the stage for the application of these ideas.

Shannon Information Definition

The definition of Shannon information for continuous functions is:

I_x = -\int_{-\infty}^{\infty} f(x) \log_2 f(x)\, dx \qquad (2.1)

The definition of Shannon information for discrete functions is:

I = -\sum_i p_i \log_2 p_i \qquad (2.2)

Rényi Information Definition

The general definition for order-\alpha Rényi information of a continuous function is:

R_\alpha = \frac{1}{1-\alpha} \log_2 \int_{-\infty}^{\infty} f^\alpha(x)\, dx \qquad (2.3)

The discrete equivalent is:

R_\alpha = \frac{1}{1-\alpha} \log_2 \sum_i p_i^\alpha \qquad (2.4)

Rényi information reduces to Shannon information in the limit \alpha \to 1. Third order Rényi information is defined as:

R = -\frac{1}{2} \log_2 \int_{-\infty}^{\infty} f^3(x)\, dx \qquad (2.5)

3. SHANNON INFORMATION OF THE DISTRIBUTIONS

Now we wish to compare the information in the marginals against the information in the time-frequency distribution. Using Shannon's information definition, the information for the time marginal is:

I_t = -\int_{-\infty}^{\infty} p(t) \log_2 p(t)\, dt = -\int_{-\infty}^{\infty} \frac{e^{-t^2/\sigma^2}}{\sqrt{\pi\sigma^2}} \log_2\!\left[\frac{e^{-t^2/\sigma^2}}{\sqrt{\pi\sigma^2}}\right] dt = \frac{\log_2 \pi}{2} + \frac{\log_2 e}{2} + \log_2 \sigma \qquad (3.1)

The information for the frequency marginal is:

I_\omega = -\int_{-\infty}^{\infty} P(\omega) \log_2 P(\omega)\, d\omega = -\int_{-\infty}^{\infty} \sqrt{\frac{\sigma^2}{\pi}}\, e^{-\omega^2\sigma^2} \log_2\!\left[\sqrt{\frac{\sigma^2}{\pi}}\, e^{-\omega^2\sigma^2}\right] d\omega = \frac{\log_2 \pi}{2} + \frac{\log_2 e}{2} - \log_2 \sigma \qquad (3.2)

The information in the Wigner distribution is:

I_{tf} = -\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} W(t,\omega) \log_2 W(t,\omega)\, dt\, d\omega = -\iint \frac{1}{\pi}\, e^{-\omega^2\sigma^2} e^{-t^2/\sigma^2} \log_2\!\left(\frac{1}{\pi}\, e^{-\omega^2\sigma^2} e^{-t^2/\sigma^2}\right) dt\, d\omega = \log_2 \pi + \log_2 e \qquad (3.3)

Thus the differential information is zero:

I_{dif} = I_{tf} - (I_t + I_\omega) = 0.0 \qquad (3.4)

The Wigner distribution provides no additional information about the signal above and beyond that provided by the marginals.

4. RÉNYI INFORMATION FOR THE DISTRIBUTIONS

A similar result is found using third order Rényi information. For the time marginal the
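The zero differential information of (3.1) through (3.4) can be reproduced numerically for the unit-variance logon. This is an illustrative sketch (grid choices are mine); because the Gaussian WD factors exactly into the product of its marginals, the 2-D entropy equals the sum of the 1-D entropies.

```python
import numpy as np

t = np.linspace(-6, 6, 601); dt = t[1] - t[0]
w = np.linspace(-6, 6, 601); dw = w[1] - w[0]

p = np.exp(-t ** 2) / np.sqrt(np.pi)          # time marginal, eq. (1.5)
P = np.exp(-w ** 2) / np.sqrt(np.pi)          # frequency marginal, eq. (1.8)
Wd = np.outer(np.exp(-t ** 2), np.exp(-w ** 2)) / np.pi   # WD, eq. (1.10)

ent = lambda f, cell: -(f * np.log2(f)).sum() * cell      # discrete -∫ f log2 f

I_t = ent(p, dt)                              # -> (log2 pi)/2 + (log2 e)/2
I_w = ent(P, dw)
I_tf = ent(Wd, dt * dw)                       # -> log2 pi + log2 e

print(I_t + I_w - np.log2(np.pi * np.e))      # ~ 0, matching eqs. (3.1)-(3.3)
print(I_tf - (I_t + I_w))                     # differential information ~ 0
```

With sigma = 1 the log2(sigma) terms in (3.1) and (3.2) vanish, so each marginal carries half of log2(pi e) bits.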


information is:

R_t = -\frac{1}{2} \log_2 \int_{-\infty}^{\infty} p^3(t)\, dt = -\frac{1}{2} \log_2 \int_{-\infty}^{\infty} (\pi\sigma^2)^{-3/2} e^{-3t^2/\sigma^2}\, dt = \frac{\log_2 \pi}{2} + \frac{\log_2 3}{4} + \log_2 \sigma \qquad (4.1)

For the frequency marginal the information is:

R_\omega = -\frac{1}{2} \log_2 \int_{-\infty}^{\infty} P^3(\omega)\, d\omega = -\frac{1}{2} \log_2 \int_{-\infty}^{\infty} \left(\frac{\sigma^2}{\pi}\right)^{3/2} e^{-3\omega^2\sigma^2}\, d\omega = \frac{\log_2 \pi}{2} + \frac{\log_2 3}{4} - \log_2 \sigma \qquad (4.2)

The Rényi information for the Wigner distribution is:

R_{tf} = -\frac{1}{2} \log_2 \iint \frac{1}{\pi^3}\, e^{-3\omega^2\sigma^2} e^{-3t^2/\sigma^2}\, dt\, d\omega = \log_2 \pi + \frac{\log_2 3}{2} \qquad (4.3)

The differential information for the Rényi information is zero:

R_{dif} = R_{tf} - [R_t + R_\omega] = 0 \qquad (4.4)

Again, the Wigner distribution provides no additional information about the signal above and beyond that provided by the marginals.

4.1 Wigner distribution for two Gabor logons

A signal containing two Gabor logons separated in time by t_0 is defined as:

s(t) = \frac{1}{\sqrt{2\sqrt{\pi\sigma^2}\left(1+e^{-t_0^2/4\sigma^2}\right)}} \left[ e^{-(t-t_0/2)^2/2\sigma^2} + e^{-(t+t_0/2)^2/2\sigma^2} \right] \qquad (4.5)

The instantaneous power is:

p(t) = s^2(t) = \frac{1}{2\sqrt{\pi\sigma^2}\left(1+e^{-t_0^2/4\sigma^2}\right)} \left[ e^{-(t+t_0/2)^2/\sigma^2} + e^{-(t-t_0/2)^2/\sigma^2} + 2 e^{-(t^2+t_0^2/4)/\sigma^2} \right] \qquad (4.6)

The spectrum of s(t) is:

S(\omega) = \frac{\sqrt{2}\,\left(\sigma^2/\pi\right)^{1/4} e^{-\omega^2\sigma^2/2} \cos(\omega t_0/2)}{\sqrt{1+e^{-t_0^2/4\sigma^2}}} \qquad (4.7)

The power in the frequency domain is:

P(\omega) = S^2(\omega) = \frac{2\sqrt{\sigma^2/\pi}\; e^{-\omega^2\sigma^2} \cos^2(\omega t_0/2)}{1+e^{-t_0^2/4\sigma^2}} \qquad (4.8)

The Wigner distribution for this signal is:

W(t,\omega) = \frac{1}{2\pi\left(1+e^{-t_0^2/4\sigma^2}\right)} \left[ e^{-(t+t_0/2)^2/\sigma^2} e^{-\omega^2\sigma^2} + e^{-(t-t_0/2)^2/\sigma^2} e^{-\omega^2\sigma^2} + 2 e^{-t^2/\sigma^2} e^{-\omega^2\sigma^2} \cos(\omega t_0) \right] \qquad (4.9)

4.2 Rényi information for two Gabor logons

The three Rényi informations are computed from the two-logon marginals and WD:

R_t = -\frac{1}{2} \log_2 \int_{-\infty}^{\infty} p^3(t)\, dt, \qquad R_\omega = -\frac{1}{2} \log_2 \int_{-\infty}^{\infty} P^3(\omega)\, d\omega, \qquad R_{tf} = -\frac{1}{2} \log_2 \iint W^3(t,\omega)\, dt\, d\omega \qquad (4.10)

The formulation becomes quite complex for two Gabor logons under Rényi information. Results for the information measures were obtained using numerical techniques in practice.

5. INFORMATION RESULTS

An information measure based on Rényi information of the third order was applied to Wigner Distribution (WD) and Reduced Interference Distribution (RID) time-frequency distributions of two Gabor logons and four Gabor logons separated by various amounts in time and frequency.

5.1 Two Gabor logons

Two Gabor logons with unity variance as previously described were separated in time according to units of time standard deviation and in frequency according to units of frequency standard deviation. Various combinations of time and frequency separation were used. Figure 2 compares the results of separating two Gabor logons in time in terms of the Rényi third order information measure and the time-bandwidth product of the two Gabor logons. One can see that both the information and the TBP remain near zero until about two units of separation. At that point both measures begin to increase. However, the information measure levels off at one bit above the zero separation value. The information gain is thus one bit. This is quite appropriate for the two logons. The TBP continues to increase with increasing separation, giving no indication of the number of signal entities present. Between time separation values of zero and three, TBP and information follow a very similar curve if scale and offset are taken into account. This implies that both measures provide something useful concerning the resolvability of the two logons. However, TBP does not provide any indication of the time separation at which the two logons are essentially resolved. The information measure shows that this occurs at a separation of about six units. Equivalent results are obtained if the separation is changed to frequency separation of the logons. If combinations of time and frequency separation are used and the Wigner distribution is derived for the resulting signal, it is found that the Rényi information measure performs well in assessing the situation. If time or frequency separation is applied singly, the results are very similar to the marginal results. If time and frequency separations are applied together, there is an interesting overestimation of information for a combination of two units of separation for both time and frequency. In this case the information gain was considered. The information obtained for a single logon was subtracted from the result. Thus the information for zero separation of two logons is zero. The maximum value observed was 1.35 bits. However, with greater separation the information value approaches one bit, as it should. These results are shown in Figure 3a. Figure 3b shows the WD of the two logons at the separations of maximum information overestimation.

The same analysis was carried out using the RID instead of the WD to determine the time-frequency distribution. Similar results were obtained. However, the RID does not produce nearly as much overestimation of the information as does the WD. In fact, the maximum overestimation was 1.08 bits. These results are shown in Figure 4. Note that the information has a residual value above zero for zero separation in this case. That is because the gain was obtained using the WD result for one logon. This was done to show that the RID has slightly more spread for a logon than does the WD. If the information gain had been determined using the RID result for one logon, the overestimation of information under the RID would be less.
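The one-bit information gain for two well separated logons can be reproduced numerically from the closed-form two-logon WD (eq. (4.9) with sigma = 1). This is an illustrative sketch, not the authors' code; the grid extents are arbitrary choices.

```python
import numpy as np

def wd_two_logons(t0, T, Om):
    """Closed-form WD of two unit-variance Gabor logons separated in time
    by t0 (eq. (4.9) with sigma = 1)."""
    norm = 2 * np.pi * (1 + np.exp(-t0 ** 2 / 4))
    g = np.exp(-Om ** 2)
    return (np.exp(-(T + t0 / 2) ** 2) * g
            + np.exp(-(T - t0 / 2) ** 2) * g
            + 2 * np.exp(-T ** 2) * g * np.cos(Om * t0)) / norm

t = np.linspace(-15, 15, 801)
w = np.linspace(-8, 8, 801)
cell = (t[1] - t[0]) * (w[1] - w[0])
T, Om = np.meshgrid(t, w, indexing="ij")
r3 = lambda W: -0.5 * np.log2((W ** 3).sum() * cell)   # third-order Rényi

base = r3(wd_two_logons(0.0, T, Om))       # single-logon reference value
for t0 in (0.0, 2.0, 4.0, 8.0):
    gain = r3(wd_two_logons(t0, T, Om)) - base
    print(f"t0 = {t0}: information gain = {gain:.3f} bits")  # approaches 1 bit
```

At t0 = 0 the expression collapses to the single-logon WD, giving the reference value log2(pi) + (log2 3)/2 of (4.3); as t0 grows, the cross term's cubed contribution averages away and the gain settles at one bit.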

Figures 3b and 4b may be compared to gain some insight into the cause of information overestimation. The larger WD interference terms may be the reason for this.

5.2 Four Gabor logons

A more ambitious application of these ideas was then carried out. Four Gabor logons were placed at the corners of a rectangle in time-frequency. One side of the rectangle was time separation and the orthogonal side was frequency separation. When the length of both sides was set to zero, the four logons completely overlapped, forming one logon. This should yield zero information gain. By increasing the length of the sides of the rectangle it was possible to explore the effects of different types of separations. When the time separation was zero, but the frequency separation was increased, the effect was that of two logons separating in frequency. Likewise, when frequency separation was zero and time separation was increased, the effect was that of two logons separating in time. Finally, when both time and frequency separation were applied, the four logons separated as at the corners of a rectangle expanding in size. The results for the WD and the RID are shown in Figure 5. The results are as one might expect. Separation along either the time or frequency dimension produces a rise to one bit, as was the case in the two logon experiment. Separation in both time and frequency produced a rise to one bit, continuing with a rise to two bits with a small plateau for some combinations of time and frequency separation. Again, the WD produced an overestimation of information in some cases and the RID produced a smaller overestimation of information. Again also, since the RID information gain was based on the WD result for one logon, there was a small amount of non-zero information at zero separation in time and frequency.

6. INFORMATION INVARIANT TIME-FREQUENCY DISTRIBUTIONS

Here we introduce a new requirement for time-frequency distributions. Information invariant time-frequency distributions are defined to be time-frequency distributions which possess an invariant information measure under time and frequency shift and time and frequency scaling. This is a useful property because it allows the information measure to be employed usefully in several ways. For example,

• The information could be minimized under selection of kernel parameters, providing the highest resolution or optimum result.

• The information measure provides an objective assessment of the number of equal energy resolvable elementary signals present.

• The information measure may provide a means of judging the effectiveness of several different distributions.

Possession of a product kernel (\phi(\theta,\tau) = \phi_p(\theta\tau)) by a distribution is sufficient for it to be information invariant. The WD and the RID enjoy this property, among others. However, many interesting time-frequency distributions do not. The spectrogram, for example, does not ordinarily preserve the volume in time-frequency under time scaling of the signal. This is because of interactions with the window. It would be possible to design a special window which adjusts, perhaps, to account for this, but it is probably not useful.
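Scaling invariance is easy to see for the WD. By the scaling property, the WD of sqrt(a) s(at) for the unit logon is (1/pi) exp(-a^2 t^2 - omega^2/a^2), and the third-order Rényi measure does not change with a. The sketch below (my illustration; grid choices are arbitrary) checks this numerically.

```python
import numpy as np

t = np.linspace(-20, 20, 1201)
w = np.linspace(-20, 20, 1201)
cell = (t[1] - t[0]) * (w[1] - w[0])
T, Om = np.meshgrid(t, w, indexing="ij")
r3 = lambda W: -0.5 * np.log2((W ** 3).sum() * cell)   # third-order Rényi

vals = []
for a in (0.5, 1.0, 2.0):                  # time scaling s(t) -> sqrt(a) s(at)
    Wa = np.exp(-(a * T) ** 2 - (Om / a) ** 2) / np.pi   # WD of scaled logon
    vals.append(r3(Wa))
    print(f"a = {a}: R3 = {vals[-1]:.4f} bits")          # same value each time
```

The time axis shrinks by the same factor the frequency axis grows, so the volume of W^3, and hence R3, is untouched; a spectrogram's fixed window breaks exactly this balance.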


Use of the information measure to optimize kernels may not produce a unique choice. There may be several choices which produce local minima in the information.

It is interesting to note that the analyzing wavelet of the wavelet transform is information invariant in time-frequency under the definitions of this paper. This may be useful in certain applications of wavelet transforms.

Despite some of the problems, the information concept may be useful if applied with care. One possible application is in instantaneous frequency applications.

6.1 Instantaneous frequency

Instantaneous frequency is an important aspect of time-frequency studies. Boashash has provided many useful insights into matters of instantaneous frequency.1 Instantaneous frequency may be defined to be the mean of frequency under the time-frequency distribution conditioned on time. The spread of instantaneous frequency is based on the variance of the instantaneous frequency about that value. Cohen has provided many interesting insights on instantaneous frequency, its spread and instantaneous bandwidth.4 In the same sense as Cohen has proposed, one may formulate an uncertainty measure for instantaneous frequency. It is:

R_3(t) = -\frac{1}{2} \log_2 \int_{-\infty}^{\infty} \left( \frac{C(t,\omega)}{\int_{-\infty}^{\infty} C(t,\omega')\, d\omega'} \right)^{3} d\omega \qquad (6.1)

This is an alternative to Cohen's local bandwidth idea. It indicates the number of resolvable entities in frequency for each time. Ideally it should be zero, indicating one component which is very narrow in frequency. Then,

R_{ave} = \int_{-\infty}^{\infty} R_3(t)\, s^2(t)\, dt \qquad (6.2)

The selection of kernels for instantaneous frequency representation8 may be aided by this form of the information measure.

7. CONCLUSIONS

The new information measure derived in this paper may be a useful tool in time-frequency analysis. It has properties similar to the time-bandwidth product in that it indicates the required separation to resolve two elementary signals. In addition, it indicates the separation required to completely resolve multiple elementary signals. It may be useful in assessing candidates for the elementary signal role as well. The terms mono-component and multi-component are used in time-frequency analysis. Perhaps a mono-component signal should only be considered to be the one with the minimum uncertainty under an information measure such as described here. The new measure may be a useful tool in designing kernels and comparing distributions. Since complicated signals may be derived from assemblages of elementary signals, this new measure should also provide useful information concerning the resolution of combinations of these complicated signals. Rényi information of the third order was utilized in this study. It may be profitable to consider other orders as well. Second order Rényi information would count the cross terms as well as the auto terms. This may serve as an index of cross term interference. Few objective measures exist to answer these questions. It is worthwhile to explore a variety of possibilities.

8. REFERENCES

1. B. Boashash, P. O'Shea and M. J. Arnold, "Algorithms for Instantaneous Frequency Estimation," SPIE Advanced Signal-Processing Algorithms, Architectures and Implementations, vol. 1348, pp. 126-148, 1990.


2. H. I. Choi and W. J. Williams, "Improved Time-Frequency Representation of Multicomponent Signals Using Exponential Kernels," IEEE Trans. Acoust., Speech, Signal Proc., vol. ASSP-37, no. 6, pp. 862-871, 1989.

3. L. Cohen, "Time-Frequency Distributions -A Review," Proc. IEEE, vol. 77, No. 7, pp.941-981, July 1989.

4. L. Cohen, "Distributions Concentrated Along the Instantaneous Frequency," SPIE Advanced Signal-Processing Algorithms, Architectures and Implementations, vol. 1348, pp. 149-157, 1990.

5. J. M. Combes, A. Grossmann and Ph. Tchamitchian (Eds.), Wavelets, Time-Frequency Methods and Phase Space, Springer-Verlag, Berlin, 1989.

6. I. Daubechies, "Time-Frequency Localization Operators: A Geometric Phase Space Approach," IEEE Trans. on Information Theory, vol. 34, no. 4, pp. 605-612, 1988.

7. B. Friedlander and B. Porat, "Detection of Transient Signals by the Gabor Representation," IEEE Trans. Acoustics, Speech and Signal Proc., vol. ASSP-37, no. 2, pp. 169-179, 1989.

8. J. Jeong, G. S. Cunningham and W. J. Williams, "Instantaneous Frequency and Kernel Requirements for Discrete Time-Frequency Distributions," SPIE Advanced Signal-Processing Algorithms, Architectures and Implementations, vol. 1348, pp. 149-157, 1990.

9. J. Jeong and W. J. Williams, "Kernel Design for Reduced Interference Distributions," accepted for publication (in press, Feb. 1992), IEEE Trans. Signal Proc.

10. D. Gabor, "Theory of Communication," J. IEE (London), vol. 93, part III, no. 26, pp. 429-457, 1946.

11. T. W. Parks and R. G. Shenoy, "Time-Frequency Concentrated Basis Functions," Proc. IEEE Intl. Conf. on Acoustics, Speech and Signal Proc., vol. 5, pp. 2459-2462, 1990.

12. A. Rényi, Probability Theory, Elsevier, Amsterdam, 1970.

13. C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, Univ. of Illinois Press, Urbana, 1949.

14. M. I. Skolnik, Introduction to Radar Systems, 2nd Ed., McGraw-Hill, New York, 1980.

15. D. Slepian and H. O. Pollak, "Prolate Spheroidal Wave Functions, Fourier Analysis and Uncertainty-I," Bell System Technical Journal, vol. 40, pp. 43-63, 1961.

16. H. J. Landau and H. O. Pollak, "Prolate Spheroidal Wave Functions, Fourier Analysis and Uncertainty-II," Bell System Technical Journal, vol. 40, pp. 65-84, 1961.

17. W. J. Williams and J. Jeong, "New Time-Frequency Distributions: Theory and Applications," IEEE Int. Symp. Circuits and Systems, pp. 1243-1247, 1989.

18. W. J. Williams and J. Jeong, "New Time-Frequency Distributions for the Analysis of Multicomponent Signals," SPIE, Advanced Algorithms and Architectures for Signal Processing V, vol. 1152, pp. 483-495, 1990.

19. W. J. Williams and J. Jeong, "Reduced Interference Time-Frequency Distributions," to appear in Time-Frequency Signal Analysis: Methods and Applications, ed. B. Boashash, Longman and Cheshire, 1991.


Figure 1.b. Gabor logon frequency marginal


Figure 1.a. Gabor logon time marginal

Figure 1.c. Gabor logon WD

Figure 2. Information and time-bandwidth product (TBP) as a function of logon separation in time standard deviations for two logons.


Figure 3.a. Information in the WD as a function of time separation and frequency separation in standard deviations for two logons.

Figure 3.b. WD result for the most over-estimated amount of information (1.35 bits).



Figure 5. Information for four logons placed at the corners of a \Delta t \times \Delta\omega rectangle, where \Delta t and \Delta\omega are in units of standard deviation. a. WD result. b. RID result.
