
Alex Dytso, Daniela Tuninetti, Natasha Devroye

Nearly Optimal Non-Gaussian Codes for the Gaussian Interference Channel

Channel Model

[Figure: two-user Gaussian interference channel. Encoder 1 maps $W_1$ to $X_1^n$ and Encoder 2 maps $W_2$ to $X_2^n$; the outputs $Y_1^n$ and $Y_2^n$ are formed through the gains $h_{11}, h_{12}, h_{21}, h_{22}$ plus noises $Z_1, Z_2$; each decoder recovers its own message.]

Symmetric: $|h_{11}|^2 = |h_{22}|^2 = S$, $|h_{12}|^2 = |h_{21}|^2 = I$,
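The symmetric model can be simulated directly. The sketch below is not from the slides; the values of `S`, `I`, the block length, and the Gaussian test inputs are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric two-user Gaussian IC: |h11|^2 = |h22|^2 = S, |h12|^2 = |h21|^2 = I.
S, I = 10.0, 2.0          # illustrative direct / cross gains
n = 1000                  # block length

h_dir, h_cross = np.sqrt(S), np.sqrt(I)

# Unit-power channel inputs (Gaussian here purely for illustration).
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)

# Unit-variance noises Z1, Z2.
z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)

# Channel outputs.
y1 = h_dir * x1 + h_cross * x2 + z1
y2 = h_cross * x1 + h_dir * x2 + z2

print(np.var(y1))  # close to S + I + 1 for unit-power inputs
```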

with the usual assumptions and capacity region definition.

Motivation

Treat Interference as Noise (TIN) inner bounds.

With time sharing:
$$\mathcal{R}^{\text{TIN+TS}}_{\text{in}} = \mathrm{co}\left( \bigcup_{P_{X_1X_2}=P_{X_1}P_{X_2}} \left\{ (R_1,R_2) : 0 \le R_1 \le I(X_1;Y_1),\; 0 \le R_2 \le I(X_2;Y_2) \right\} \right)$$

No time sharing:
$$\mathcal{R}^{\text{TINnoTS}}_{\text{in}} = \bigcup_{P_{X_1X_2}=P_{X_1}P_{X_2}} \left\{ (R_1,R_2) : 0 \le R_1 \le I(X_1;Y_1),\; 0 \le R_2 \le I(X_2;Y_2) \right\}$$

How far away is TINnoTS from the capacity?

Capacity:
$$C = \lim_{n\to\infty} \mathrm{co}\left( \bigcup_{P_{X_1^nX_2^n}=P_{X_1^n}P_{X_2^n}} \left\{ 0 \le R_1 \le \tfrac{1}{n} I(X_1^n;Y_1^n),\; 0 \le R_2 \le \tfrac{1}{n} I(X_2^n;Y_2^n) \right\} \right)$$

R. Ahlswede, “Multi-way communication channels,” in Proc. IEEE Int. Symp. Inf. Theory, March 1973, pp. 23–52.

Outline

• Definitions and relevant past work

• Useful tools:

- Mutual information lower bounds

- Minimum distance lower bounds

• Main result

Mixed Inputs

$$X = \sqrt{1-\delta}\, X_D + \sqrt{\delta}\, X_G, \qquad \delta \in [0,1], \qquad X_D \sim \mathrm{PAM}(N), \qquad X_G \sim \mathcal{N}(0,1)$$

PAM + PAM:
$$X = \sqrt{1-\delta}\, X_{D_1} + \sqrt{\delta}\, X_{D_2}, \qquad X_{D_1} \sim \mathrm{PAM}(N_1),\; X_{D_2} \sim \mathrm{PAM}(N_2)$$
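A mixed input is easy to sample. The sketch below is illustrative only (the helper `pam`, the power split `delta`, and the constellation size are our choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)

def pam(N, size, rng):
    """Draw `size` i.i.d. symbols from a unit-energy PAM(N) constellation."""
    levels = np.arange(N) - (N - 1) / 2            # equally spaced, zero mean
    levels = levels / np.sqrt(np.mean(levels**2))  # normalize to unit power
    return rng.choice(levels, size=size)

delta, N, n = 0.3, 4, 100_000      # illustrative power split / constellation size

xd = pam(N, n, rng)                # discrete part X_D ~ PAM(N)
xg = rng.standard_normal(n)        # Gaussian part X_G ~ N(0, 1)

# Mixed input X = sqrt(1 - delta) X_D + sqrt(delta) X_G, still unit power.
x = np.sqrt(1 - delta) * xd + np.sqrt(delta) * xg

print(np.mean(x**2))               # close to 1
```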

Capacity in Strong Interference: $|h_{11}|^2 \le |h_{21}|^2$ and $|h_{22}|^2 \le |h_{12}|^2$.

Sato, H., "The capacity of the Gaussian interference channel under strong interference," IEEE Trans. Inf. Theory, vol. 27, no. 6, pp. 786–788, Nov. 1981.

Sum-Capacity in Very Weak (noisy) Interference: $\sqrt{I/S}\,(1+I) \le \tfrac{1}{2}$.

X. Shang, G. Kramer, and B. Chen, “A new outer bound and the noisy-interference sum rate capacity for Gaussian interference channels,” IEEE Trans. Inf. Theory, vol. 55, no. 2, pp. 689–699, 2009.

Motahari, A.S.; Khandani, A.K., "Capacity Bounds for the Gaussian Interference Channel," IEEE Trans. Inf. Theory, vol. 55, no. 2, pp. 620–643, Feb. 2009.

Annapureddy, V.S.; Veeravalli, V.V., "Gaussian Interference Networks: Sum Capacity in the Low-Interference Regime and New Outer Bounds on the Capacity Region," IEEE Trans. Inf. Theory, vol. 55, no. 7, pp. 3032–3050, July 2009.

Known Results

Approximate capacity with a universal gap:

[Figure: capacity region in the $(R_1,R_2)$ plane with inner bound $R_I$ and outer bound $R_O$.]

Avestimehr, A. Salman, Suhas N. Diggavi, Chao Tian, and David N. C. Tse, "An Approximation Approach to Network Information Theory."

Approximate capacity via HK + Gaussian inputs: within 1/2 bit.

R. Etkin, D. Tse, and H. Wang, "Gaussian interference channel capacity to within one bit," IEEE Trans. Inf. Theory, vol. 54, no. 12, pp. 5534–5562, Dec. 2008.

[Figure: Han-Kobayashi scheme over $p(y_1,y_2|x_1,x_2)$. Encoder 1 sends $(W_{1c},W_{1p})$ as $X_1^n$ and Encoder 2 sends $(W_{2c},W_{2p})$ as $X_2^n$; Decoder 1 recovers $(W_{1c},W_{1p},W_{2c})$ from $Y_1^n$ and Decoder 2 recovers $(W_{2c},W_{2p},W_{1c})$ from $Y_2^n$.]

A. Dytso, D. Tuninetti, and N. Devroye, “On the two-user interference channel with lack of knowledge of the interference codebook at one receiver,” IEEE Trans. Inf. Theory, vol. 61, no. 3, pp. 1257–1276, March 2015.

[Figure: "one-sided" Han-Kobayashi scheme over $p(y_1,y_2|x_1,x_2)$. Encoder 1 sends $(W_{1c},W_{1p})$ as $X_1^n$ and Encoder 2 sends $W_2$ as $X_2^n$; Decoder 1 recovers $(W_{1c},W_{1p})$ from $Y_1^n$ and Decoder 2 recovers $(W_2,W_{1c})$ from $Y_2^n$.]

"One-sided" HK + Mixed Inputs: within 1.75 bits.

R. Etkin, D. Tse, and H. Wang, "Gaussian interference channel capacity to within one bit," IEEE Trans. Inf. Theory, vol. 54, no. 12, pp. 5534–5562, Dec. 2008.

Generalized degrees of freedom (gDoF):
$$\mathcal{D}(\alpha) := \left\{ (d_1,d_2) \in \mathbb{R}^2_+ : d_i := \lim_{I = S^\alpha,\, S\to\infty} \frac{R_i}{\frac{1}{2}\log(1+S)},\; i \in [1:2],\; (R_1,R_2) \text{ is achievable} \right\}.$$

gDoF Region

[Figure: gDoF regions in the $(d_1,d_2)$ plane, with boundaries including $d_1 + 2d_2 = 2$, $2d_1 + d_2 = 2$, and $d_1 + d_2 = 2-\alpha$.]

Sum-gDoF

[Figure: sum-gDoF "W-curve" $d_W(\alpha)$ with breakpoints at $\alpha = 1/2,\, 2/3,\, 1,\, 2$, compared against TIN with Gaussian inputs (TIN-G).]

$$\mathcal{R}^{\text{TINnoTS}}_{\text{in}} = \bigcup_{P_{X_1X_2}=P_{X_1}P_{X_2}} \left\{ 0 \le R_1 \le I(X_1;Y_1),\; 0 \le R_2 \le I(X_2;Y_2) \right\}$$

Achievability

$$\mathcal{R}^{\text{TINnoTS}}_{\text{in}} \supseteq \bigcup_{N_1,N_2,\delta_1,\delta_2} \left\{ 0 \le R_1 \le I(X_1;Y_1),\; 0 \le R_2 \le I(X_2;Y_2) \right\}$$
with
$$X_i = \sqrt{1-\delta_i}\, X_{iD} + \sqrt{\delta_i}\, X_{iG}, \qquad \delta_i \in [0,1], \qquad X_{iD} \sim \mathrm{PAM}(N_i), \qquad X_{iG} \sim \mathcal{N}(0,1), \qquad i = 1,2.$$

For mixed inputs, treating the Gaussian portions as noise,
$$I(X_2;Y_2) = I(X_2;\, h_{21}X_1 + h_{22}X_2 + Z_G)$$
$$= I\!\left(X_{1D}, X_{2D};\; \frac{\sqrt{1-\delta_1}\, h_{21}X_{1D} + \sqrt{1-\delta_2}\, h_{22}X_{2D}}{\sqrt{1 + |h_{21}|^2\delta_1 + |h_{22}|^2\delta_2}} + Z_G\right) - I\!\left(X_{1D};\; \frac{\sqrt{1-\delta_1}}{\sqrt{1 + |h_{21}|^2\delta_1}}\, h_{21}X_{1D} + Z_G\right)$$
$$+ \frac{1}{2}\log\!\left(1 + |h_{21}|^2\delta_1 + |h_{22}|^2\delta_2\right) - \frac{1}{2}\log\!\left(1 + |h_{21}|^2\delta_1\right).$$

Analytical rate expressions are not trivial to obtain.
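Since closed-form rate expressions are unavailable, quantities such as $I(X_D; \sqrt{S}X_D + Z)$ can be evaluated numerically. Below is a sketch (the function name `mi_pam`, the integration grid, and the operating point are our choices) that integrates the Gaussian-mixture output density:

```python
import numpy as np

def mi_pam(S, N):
    """Numerically evaluate I(X_D; sqrt(S) X_D + Z), in bits, for a
    unit-energy PAM(N) input X_D and Z ~ N(0, 1)."""
    grid = np.linspace(-40, 40, 40001)             # integration grid for Y
    dx = grid[1] - grid[0]
    levels = np.arange(N) - (N - 1) / 2
    levels = levels / np.sqrt(np.mean(levels**2))  # unit average energy
    means = np.sqrt(S) * levels
    # p_Y: equal-weight mixture of unit-variance Gaussians at sqrt(S) * levels.
    py = np.mean(
        np.exp(-0.5 * (grid[None, :] - means[:, None]) ** 2) / np.sqrt(2 * np.pi),
        axis=0,
    )
    h_y = -np.sum(py * np.log2(py + 1e-300)) * dx  # differential entropy h(Y)
    return h_y - 0.5 * np.log2(2 * np.pi * np.e)   # I = h(Y) - h(Z)

S = 100.0
print(mi_pam(S, 4))              # approaches log2(4) = 2 bits at high SNR
print(0.5 * np.log2(1 + S))      # Gaussian-input benchmark
```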

Bounds

Ozarow-Wyner-A:
$$I(X_D;\, \sqrt{S}X_D + Z) \ge H(X_D) - \mathrm{gap},$$
$$\mathrm{gap} = \xi \log\frac{1}{\xi} + (1-\xi)\log\frac{1}{1-\xi} + \xi \log(N-1), \qquad \xi := 2Q\!\left(\frac{\sqrt{S}\, d_{\min}(X_D)}{2}\right).$$

Ozarow-Wyner-B:
$$I(X_D;\, \sqrt{S}X_D + Z) \ge H(X_D) - \mathrm{gap}, \qquad \mathrm{gap} = \frac{1}{2}\log\!\left(\frac{\pi e}{6}\right) + \frac{1}{2}\log\!\left(1 + \frac{12}{S\, d^2_{\min}(X_D)}\right).$$

DTD-ITA'14-A:
$$I(X_D;\, \sqrt{S}X_D + Z) \ge \left[ -\log\!\left( \sum_{(i,j)\in[1:N]^2} p_i\, p_j\, \frac{1}{\sqrt{4\pi}}\, e^{-\frac{(s_i - s_j)^2}{4}} \right) - \frac{1}{2}\log(2\pi e) \right]^+.$$

DTD-ITA'14-B:
$$I(X_D;\, \sqrt{S}X_D + Z) \ge \log(N) - \mathrm{gap}, \qquad \mathrm{gap} = \frac{1}{2}\log\!\left(\frac{e}{2}\right) + \frac{1}{2}\log\!\left(1 + (N-1)\, e^{-\frac{S\, d^2_{\min}(X_D)}{4}}\right).$$

Dytso, A.; Tuninetti, D.; Devroye, N., "On discrete alphabets for the two-user Gaussian interference channel with one receiver lacking knowledge of the interfering codebook," Proc. Inf. Theory and Applications Workshop (ITA), Feb. 2014, pp. 1–8.
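The Ozarow-Wyner gaps are easy to evaluate numerically for PAM inputs. A sketch (helper names `Q`, `ow_a_gap`, `ow_b_gap` and the 40 dB operating point are ours, not from the slides):

```python
import numpy as np
from math import erfc, log2, sqrt, pi, e

def Q(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * erfc(x / sqrt(2.0))

def ow_a_gap(S, N):
    """Ozarow-Wyner-A gap (bits) for unit-energy PAM(N) at SNR S."""
    dmin = 2.0 * sqrt(3.0 / (N**2 - 1))   # min distance of unit-energy PAM(N)
    xi = 2.0 * Q(sqrt(S) * dmin / 2.0)
    h_xi = -xi * log2(xi) - (1 - xi) * log2(1 - xi)  # binary entropy of xi
    return h_xi + xi * log2(N - 1)

def ow_b_gap(S, N):
    """Ozarow-Wyner-B gap (bits) for unit-energy PAM(N) at SNR S."""
    dmin2 = 12.0 / (N**2 - 1)
    return 0.5 * log2(pi * e / 6.0) + 0.5 * log2(1.0 + 12.0 / (S * dmin2))

S = 10.0 ** 4                     # 40 dB
N = int(sqrt(1.0 + S))            # N = floor(sqrt(1 + S)), as on a later slide
print(N, ow_a_gap(S, N), ow_b_gap(S, N))
```

With $N = \lfloor\sqrt{1+S}\rfloor$ both gaps stay below one bit at this operating point, consistent with the bound-comparison figure.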


Number of Points / Bound Comparison: $N = \lfloor\sqrt{1+S}\rfloor$

[Figure: two panels vs. SNR (dB, 0 to 110) — left: gap (bits, 0 to 2) of Ozarow-Wyner-A/B and DTD-ITA14-A/B against the shaping loss; right: achievable rates (bits, 0 to 20) of the four bounds against capacity.]

Minimum Distance

[Figure: $d_{\min}(h_1X_{1D} + h_2X_{2D})$ vs. $h_1$ for $h_2 = 1$, $N_1 = N_2 = 10$: very irregular.]

Recall Ozarow-Wyner-B:
$$I(X_D;\, \sqrt{S}X_D + Z) \ge H(X_D) - \frac{1}{2}\log\!\left(\frac{\pi e}{6}\right) - \frac{1}{2}\log\!\left(1 + \frac{12}{S\, d^2_{\min}(X_D)}\right).$$

Applying it to terms like $I(X_{1D}, X_{2D};\, h_1X_{1D} + h_2X_{2D} + Z)$ requires the minimum distance of the sum constellation:
$$d_{\min}(h_1X_{1D} + h_2X_{2D}) = \min\big(h_1\, d_{\min}(X_{1D}),\; h_2\, d_{\min}(X_{2D})\big)$$
if either $h_2\, d_{\min}(X_{2D})\, N_2 \le h_1\, d_{\min}(X_{1D})$ or $h_1\, d_{\min}(X_{1D})\, N_1 \le h_2\, d_{\min}(X_{2D})$.

[Figure: constellations $h_{11}X_1$ and $h_{11}X_1 + h_{12}X_2$ with their minimum distances.]
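The minimum distance of a sum constellation can be checked by brute force. A sketch for the slides' example $h_2 = 1$, $N_1 = N_2 = 10$ (the helpers `pam_points` and `min_dist`, and the sample values of $h_1$, are ours):

```python
import numpy as np

def pam_points(N):
    """Points of a unit-energy PAM(N) constellation."""
    p = np.arange(N) - (N - 1) / 2
    return p / np.sqrt(np.mean(p**2))

def min_dist(points):
    """Minimum distance between *distinct* points of a 1-D constellation."""
    s = np.unique(np.round(points, 9))   # merge numerically coincident points
    return float(np.min(np.diff(s)))

N1 = N2 = 10
x1, x2 = pam_points(N1), pam_points(N2)
h2 = 1.0

# d_min(h1*X1 + h2*X2) is a very irregular function of h1 (cf. the figure).
for h1 in (0.5, 2.0, 7.0):
    sum_points = (h1 * x1[:, None] + h2 * x2[None, :]).ravel()
    print(h1, min_dist(sum_points))
```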

[Figure: $d_{\min}(h_xX + h_yY)$ vs. $h_x$ for bound 1 (no overlaps) and bound 2 (overlaps), marking the regions where Proposition 2 is and is not valid.]

For almost all channel gains:
$$d_{\min}(h_1X_{1D} + h_2X_{2D}) \ge \kappa \cdot \min\!\left(h_1\, d_{\min}(X_{1D}),\; h_2\, d_{\min}(X_{2D}),\; \max\!\left(\frac{h_1\, d_{\min}(X_{1D})}{N_2},\; \frac{h_2\, d_{\min}(X_{2D})}{N_1}\right)\right),$$
$$\kappa = \frac{\gamma}{1 + \log(\max(N_1, N_2))},$$
for all $(h_1, h_2)$ except an outage set of measure at most $\gamma$, for any $\gamma > 0$.

Main Result

With $\alpha = \dfrac{\log I}{\log S}$, the regimes and input choices are:

- Very Weak ($0 \le \alpha \le 1/2$): Gaussian inputs
- Weak I ($1/2 \le \alpha \le 2/3$): Mixed inputs
- Weak II ($2/3 \le \alpha \le 1$): Mixed inputs
- Strong ($1 \le \alpha \le 2$): Discrete inputs
- Very Strong ($\alpha \ge 2$): Discrete inputs

with $X_i = \sqrt{1-\delta_i}\, X_{iD} + \sqrt{\delta_i}\, X_{iG}$, $i \in [1:2]$. The resulting gaps are $1/2$, $3.79$, and $1.25$ bits, and $O(\log\log(\min(S, I)))$ up to an outage set of controllable measure. Closed-form expressions are given for the number of points, the power splits, and the gap.

Example

[Figure: rate regions in the $(R_1, R_2)$ plane comparing the capacity region with the achievable regions of Theorem 7, Ozarow-Wyner-B, DTD-ITA-A, and a Monte Carlo evaluation.]

PAM + PAM

Let
$$X_D := X_{D_1} + X_{D_2}, \quad \text{where } X_{D_1} \sim \text{discrete with } d_{\min}(X_{D_1}) > 0,\; X_{D_2} \sim \text{discrete with } d_{\min}(X_{D_2}) > 0,$$
$$X_M := X_{D_1} + X_G,$$
where $X_{D_1}$, $X_G$ and $X_{D_2}$ are mutually independent. Then, for $Z_G \sim \mathcal{N}(0,1)$ independent of everything else, we have
$$I(X_D;\, gX_D + Z_G) - I(X_M;\, gX_M + Z_G) \le \frac{1}{2}\log(2),$$
$$I(X_M;\, gX_M + Z_G) - I(X_D;\, gX_D + Z_G) \le \frac{1}{2}\log\!\left(\frac{\pi e}{3}\right) + \frac{1}{2}\log\!\left(1 + \frac{12}{g^2\, d^2_{\min}(X_D)}\right).$$

Concluding Remarks

• Key idea: use non-Gaussian inputs

• Developed very general tools of use beyond 2-IC

• Applicable to Block Asynchronous Interference Channel and Codebook Oblivious Interference Channel

Thank you. arXiv:1506.02597, odytso2@uic.edu
