How to Solve Gaussian Interference Channel
WPI, HKU, 2019
Fan Cheng
Shanghai Jiao Tong University
chengfan@sjtu.edu.cn
$$h(X + \sqrt{t}\,Z), \qquad \frac{\partial}{\partial t} f(x,t) = \frac{1}{2}\,\frac{\partial^2}{\partial x^2} f(x,t)$$
$$h(X) = -\int f(x) \log f(x)\,\mathrm{d}x, \qquad Y = X + \sqrt{t}\,Z, \quad Z \sim \mathcal{N}(0,1)$$
□ A new mathematical theory on the Gaussian distribution
□ Its application to the Gaussian interference channel
□ History, progress, and future
Outline
□ History of the "Super-H" theorem
□ Boltzmann equation, heat equation
□ Shannon entropy power inequality
□ Complete monotonicity conjecture
□ How to solve the Gaussian interference channel
Fire and Civilization
Drill, steam engine
James Watt
Myth: west and east
Independence of the US
The Wealth of Nations
1776
Study of Heat
Heat transfer
□ The history begins with the work of Joseph Fourier around 1807
□ In a remarkable memoir, Fourier invented both the heat equation and the method of Fourier analysis for its solution
$$\frac{\partial}{\partial t} f(x,t) = \frac{1}{2}\,\frac{\partial^2}{\partial x^2} f(x,t)$$
Information Age
Gaussian channel: $Z_t \sim \mathcal{N}(0, t)$
$X$ and $Z$ are mutually independent. The p.d.f. of $X$ is $g(x)$.
$Y_t$ is the convolution of $X$ and $Z_t$: $Y_t \triangleq X + Z_t$.
The probability density function (p.d.f.) of $Y_t$:
$$f(y; t) = \int g(x)\,\frac{1}{\sqrt{2\pi t}}\, e^{-\frac{(y-x)^2}{2t}}\,\mathrm{d}x$$
$$\frac{\partial}{\partial t} f(y;t) = \frac{1}{2}\,\frac{\partial^2}{\partial y^2} f(y;t)$$
The p.d.f. of Y is the solution to the heat equation, and vice versa.
Gaussian channel and heat equation are identical in mathematics.
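As a quick numerical illustration of this identity (an addition of mine, not from the slides), the sketch below convolves a non-Gaussian input density $g$ with the $\mathcal{N}(0,t)$ kernel and checks that the result satisfies the heat equation; the grid, the Gaussian-mixture input, and the step sizes are arbitrary choices.

```python
import numpy as np

# Grid and a non-Gaussian input density g (a two-component Gaussian mixture)
x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]
g = 0.5*np.exp(-(x-2)**2/2)/np.sqrt(2*np.pi) + 0.5*np.exp(-(x+2)**2/2)/np.sqrt(2*np.pi)

def f(t):
    """Density of Y_t = X + Z_t: convolution of g with the N(0, t) kernel."""
    kern = np.exp(-x**2/(2*t)) / np.sqrt(2*np.pi*t)
    return np.convolve(g, kern, mode='same') * dx

t, dt = 1.0, 1e-4
ft  = (f(t+dt) - f(t-dt)) / (2*dt)            # df/dt, central difference
fyy = np.gradient(np.gradient(f(t), dx), dx)  # d^2 f / dy^2
print(np.max(np.abs(ft - 0.5*fyy)))           # close to 0: the heat equation holds
```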
A mathematical theory of communication,
Bell System Technical Journal.
Ludwig Boltzmann
Ludwig Eduard Boltzmann
1844-1906
Vienna, Austrian Empire
Boltzmann formula: $S = k_B \ln W$
Gibbs formula: $S = -k_B \sum_i p_i \ln p_i$
Boltzmann equation: $\frac{\partial f}{\partial t} = \left(\frac{\partial f}{\partial t}\right)_{\text{force}} + \left(\frac{\partial f}{\partial t}\right)_{\text{diff}} + \left(\frac{\partial f}{\partial t}\right)_{\text{coll}}$
H-theorem: $H(f(t))$ is non-decreasing
□ McKean's problem on the Boltzmann equation (1966): is $H(f(t))$ CM in $t$ when $f(t)$ satisfies the Boltzmann equation?
□ False; disproved by E. Lieb in the 1970s
□ For the particular Bobylev-Krook-Wu explicit solutions, this "theorem" holds true for $n \leq 101$ and breaks down afterwards
"Super H-theorem" for Boltzmann Equation
H. P. McKean, NYU.
National Academy of Sciences
A function is completely monotone (CM) iff the signs of its derivatives alternate in +/-: +, -, +, -, … (e.g., $1/t$, $e^{-t}$)
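A two-line sympy check of the alternating-sign pattern for the two examples above (an illustration added here, not part of the talk):

```python
import sympy as sp

t = sp.symbols('t', positive=True)
for f in (1/t, sp.exp(-t)):
    # Signs of f, f', f'', ... at the arbitrary point t = 3/2 alternate: +, -, +, -, +
    print(f, [sp.sign(sp.diff(f, t, n).subs(t, sp.Rational(3, 2))) for n in range(5)])
```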
□ Heat equation: is $H(f(t))$ CM in $t$, if $f(t)$ satisfies the heat equation?
□ Equivalently, is $h(X + \sqrt{t}\,Z)$ CM in $t$?
□ The signs of the first two derivatives were obtained
□ Failed to obtain the 3rd and 4th (it is easy to compute the derivatives; it is hard to obtain their signs)
"Super H-theorem" for Heat Equation
"This suggests that……, etc., but I could not prove it"
-- H. P. McKean
C. Villani, 2010 Fields Medalist
Claude E. Shannon and EPI
□ Entropy power inequality (Shannon 1948): for any two independent continuous random variables $X$ and $Y$,
$$e^{2h(X+Y)} \geq e^{2h(X)} + e^{2h(Y)}$$
Equality holds iff $X$ and $Y$ are Gaussian
□ Motivation: Gaussian noise is the worst noise
□ Impact: a new characterization of the Gaussian distribution in information theory
□ Comments: most profound! (Kolmogorov)
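Shannon's inequality is easy to probe numerically. A rough sketch (my addition; the histogram entropy estimator and the uniform/exponential inputs are arbitrary choices):

```python
import numpy as np

def entropy_power(samples, bins=2000):
    """exp(2 h(X)), with h estimated from a histogram density (biased, but fine as a sanity check)."""
    p, edges = np.histogram(samples, bins=bins, density=True)
    dx = edges[1] - edges[0]
    h = -np.sum(p[p > 0] * np.log(p[p > 0])) * dx
    return np.exp(2 * h)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, 10**6)           # non-Gaussian
Y = rng.exponential(1.0, 10**6) - 1.0   # independent, non-Gaussian
print(entropy_power(X + Y), entropy_power(X) + entropy_power(Y))  # LHS > RHS, strictly
```

With the alternative normalization $N(X) = e^{2h(X)}/(2\pi e)$ the statement is unchanged, since the constant cancels on both sides.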
Central limit theorem
Capacity region of the Gaussian broadcast channel
Capacity region of the Gaussian Multiple-Input Multiple-Output broadcast channel
Uncertainty principle
All of them can be proved by the Entropy Power Inequality (EPI)
โก Shannon himself didnโt give a proof but an explanation, which turned out to be wrong
□ The first proofs were given by A. J. Stam (1959) and N. M. Blachman (1966)
□ Research on EPI: generalizations, new proofs, new connections. E.g., the Gaussian interference channel is open; some stronger "EPI" should exist.
□ Stanford information theory school: Thomas Cover and his students A. El Gamal, M. H. Costa, A. Dembo, A. Barron (1980-1990)
โก After 2000, Princeton && UC Berkeley
Entropy Power Inequality
Heart of Shannon theory
Ramification of EPI
Shannon EPI
Gaussian perturbation: $h(X + \sqrt{t}\,Z)$
Fisher information: $\frac{\partial}{\partial t}\, h(X + \sqrt{t}\,Z) = \frac{1}{2}\, I(X + \sqrt{t}\,Z)$
Fisher information is decreasing in $t$
$e^{2h(X + \sqrt{t}\,Z)}$ is concave in $t$
Fisher information inequality (FII): $\frac{1}{I(X+Y)} \geq \frac{1}{I(X)} + \frac{1}{I(Y)}$
Tight Young's inequality: $\|f * g\|_r \leq C_{p,q,r}\, \|f\|_p\, \|g\|_q$ with the sharp constant $C_{p,q,r}$
Status Quo: FII can imply EPI and all its generalizations.
Many network information theory problems remain open even when the noise is Gaussian.
-- EPI alone is not sufficient
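To see the FII above in action, a grid-based sketch (my own illustration; the two Gaussian-mixture densities are arbitrary non-Gaussian examples):

```python
import numpy as np

def fisher_info(f, dx):
    """I(f) = integral of f'(x)^2 / f(x), computed on a grid, ignoring negligible tails."""
    fp = np.gradient(f, dx)
    mask = f > 1e-12
    return np.sum(fp[mask]**2 / f[mask]) * dx

x = np.linspace(-30, 30, 2**14)
dx = x[1] - x[0]
fX = 0.5*np.exp(-(x-2)**2/2)/np.sqrt(2*np.pi) + 0.5*np.exp(-(x+2)**2/2)/np.sqrt(2*np.pi)
fY = 0.5*np.exp(-(x-1)**2/8)/np.sqrt(8*np.pi) + 0.5*np.exp(-(x+1)**2/8)/np.sqrt(8*np.pi)
fS = np.convolve(fX, fY, mode='same') * dx   # density of the independent sum X + Y
print(1/fisher_info(fS, dx), 1/fisher_info(fX, dx) + 1/fisher_info(fY, dx))  # LHS >= RHS
```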
Where our journey begins
Shannon entropy power inequality
Fisher information inequality
$h(X + \sqrt{t}\,Z)$
$H(f(t))$ is CM:
When $f(t)$ satisfies the Boltzmann equation: disproved
When $f(t)$ satisfies the heat equation: unknown
We even don't know what CM is!
Mathematicians ignored it
Raymond introduced this paper to me in 2008
I made some progress with Chandra Nair in 2011 (MGL)
Complete monotonicity (CM) was discovered in 2012
The third derivative in 2013 (Key breakthrough)
The fourth order in 2014
Recently, CM && GIC
Motivation
Motivation: to find some inequalities to obtain a better rate region; e.g., the convexity of $N(X + e^{-t}Z)$, the concavity of $1/I(X + \sqrt{t}\,Z)$, etc.
"Any progress?"
"Nope…"
It is widely believed that there should be no
new EPI except Shannon EPI and FII.
Observation: $I(X + \sqrt{t}\,Z)$ is convex in $t$
$I(X + \sqrt{t}\,Z) = 2\,\frac{\mathrm{d}}{\mathrm{d}t}\, h(X + \sqrt{t}\,Z) \geq 0$ (de Bruijn, 1958)
$I^{(1)} = \frac{\mathrm{d}}{\mathrm{d}t}\, I(X + \sqrt{t}\,Z) \leq 0$ (McKean 1966, Costa 1985)
Could the third one be determined?
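The question can at least be probed numerically. The experiment below (my addition; a numerical illustration, not a proof) computes $h(X + \sqrt{t}\,Z)$ on a grid for a Gaussian-mixture $X$ and reads off the signs of the first four finite-difference derivatives in $t$:

```python
import numpy as np

x = np.linspace(-25, 25, 2**13)
dx = x[1] - x[0]
# Non-Gaussian input: a symmetric two-component Gaussian mixture
g = 0.5*np.exp(-(x-3)**2/2)/np.sqrt(2*np.pi) + 0.5*np.exp(-(x+3)**2/2)/np.sqrt(2*np.pi)

def h(t):
    """Differential entropy of Y_t = X + sqrt(t) Z, computed on the grid."""
    kern = np.exp(-x**2/(2*t)) / np.sqrt(2*np.pi*t)
    f = np.convolve(g, kern, mode='same') * dx
    f = f[f > 1e-300]
    return -np.sum(f * np.log(f)) * dx

step = 0.05
hs = np.array([h(1.0 + step*k) for k in range(9)])
for n in range(1, 5):
    print(n, np.sign(np.diff(hs, n)[0]))  # observed: +, -, +, - (consistent with GCMC)
```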
Discovery
Observation: $I(X + \sqrt{t}\,Z)$ is convex in $t$
Gaussian case: $h(X + \sqrt{t}\,Z) = \frac{1}{2}\ln(2\pi e t)$, $I(X + \sqrt{t}\,Z) = \frac{1}{t}$. $I$ is CM: +, -, +, -, …
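Spelling out the Gaussian case (a one-line computation added for completeness):
$$\frac{\mathrm{d}^n}{\mathrm{d}t^n}\,\frac{1}{t} = \frac{(-1)^n\, n!}{t^{n+1}}, \qquad \text{so } (-1)^n I^{(n)}(X + \sqrt{t}\,Z) = \frac{n!}{t^{n+1}} \geq 0 \ \text{ for all } n.$$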
If the observation is true, the first three derivatives are: +, -, +
Q: Is the 4th-order derivative negative? Because $Z$ is Gaussian! If so, then…
The signs of the derivatives of $h(X + \sqrt{t}\,Z)$ are independent of $X$. Invariant!
Exactly the same problem in McKeanโs 1966 paper
To convince people, one must prove its convexity
My own opinion:
โข A new fundamental result on Gaussian distribution
โข Invariant is very important in mathematics
โข In mathematics, the more beautiful, the more powerful
โข Very hard to make any progress
Challenge
Let $X \sim g(x)$.
$h(Y_t) = -\int f(y,t) \ln f(y,t)\,\mathrm{d}y$: no closed-form expression except for some special $g(x)$. $f(y,t)$ satisfies the heat equation.
Writing $f_k$ for $\frac{\partial^k}{\partial y^k} f$:
$$I(Y_t) = \int \frac{f_1^2}{f}\,\mathrm{d}y$$
$$I^{(1)}(Y_t) = -\int f\left(\frac{f_2}{f} - \frac{f_1^2}{f^2}\right)^2 \mathrm{d}y$$
So what is ๐ผ(2)? (Heat equation, integration by parts)
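For a Gaussian input the identity for $I^{(1)}$ can be verified symbolically, since everything is closed-form. A sympy sketch (my addition; it checks only the case $X \sim \mathcal{N}(0, s)$):

```python
import sympy as sp

y, t, s = sp.symbols('y t s', positive=True)
# Gaussian input X ~ N(0, s): the density of Y_t = X + sqrt(t) Z is N(0, s + t)
f = sp.exp(-y**2 / (2*(s + t))) / sp.sqrt(2*sp.pi*(s + t))

I = sp.integrate(sp.simplify(sp.diff(f, y)**2 / f), (y, -sp.oo, sp.oo))  # 1/(s+t)
lhs = sp.diff(I, t)                                                      # -1/(s+t)^2
rhs = -sp.integrate(f * sp.diff(sp.log(f), y, 2)**2, (y, -sp.oo, sp.oo))
print(sp.simplify(lhs - rhs))  # 0: dI/dt matches the integration-by-parts formula
```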
Challenge (cont'd)
It is trivial to calculate derivatives.
It is not generally obvious to prove their signs.
Breakthrough
Integration by parts: $\int u\,\mathrm{d}v = uv - \int v\,\mathrm{d}u$
First breakthrough since
McKean 1966
Gaussian complete monotonicity conjecture (GCMC):
$I(X + \sqrt{t}\,Z)$ is CM in $t$
A general form: number partitions; hard to determine the coefficients
Conjecture 2: $\log I(X + \sqrt{t}\,Z)$ is convex in $t$
Hard to find the $\beta_{k,j}$!
Moreover
C. Villani showed the work of H. P. McKean to us.
G. Toscani cited our work within two weeks:
"…the evolution of the entropy and of its subsequent derivatives along the solution to the heat equation has important consequences. Indeed the argument of McKean about the signs of the first two derivatives is equivalent to the proof of the logarithmic Sobolev inequality."
Gaussian optimality for derivatives of differential entropy using linear matrix inequalities
X. Zhang, V. Anantharam, Y. Geng - Entropy, 2018 - mdpi.com
โข A new method to prove signs by LMI
โข Verified the first four derivatives
โข For the fifth order derivative, current methods cannot find a solution
Complete monotone function
Herbert R. Stahl, 2013
$$f(t) = \int_0^\infty e^{-tx}\,\mathrm{d}\mu(x)$$
A new expression for entropy involves special functions in mathematical physics
How to construct $\mu(x)$?
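For a Gaussian input the measure is explicit (an example added here, using the closed form $I(X+\sqrt{t}\,Z) = 1/(\sigma^2+t)$ for $X \sim \mathcal{N}(0,\sigma^2)$):
$$I(X + \sqrt{t}\,Z) = \frac{1}{\sigma^2 + t} = \int_0^\infty e^{-tx}\, e^{-\sigma^2 x}\,\mathrm{d}x, \qquad \text{i.e., } \mathrm{d}\mu(x) = e^{-\sigma^2 x}\,\mathrm{d}x.$$
GCMC asks whether such a nonnegative measure exists for every input distribution.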
Complete monotone function
Theorem: if a function $f(t)$ is CM in $t$, then $\log f(t)$ is also convex in $t$. So if $I(Y_t)$ is CM in $t$, then $\log I(Y_t)$ is convex in $t$ (Conjecture 1 implies Conjecture 2).
If $f(t)$ is CM, a Schur-convex function can be obtained from $f(t)$. Schur-convexity connects to majorization theory.
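The first implication is a standard Cauchy-Schwarz argument (a sketch added for completeness): writing $f(t) = \int_0^\infty e^{-tx}\,\mathrm{d}\mu(x)$ via Bernstein's representation,
$$f'(t)^2 = \left(\int_0^\infty x\, e^{-tx}\,\mathrm{d}\mu\right)^2 \leq \int_0^\infty e^{-tx}\,\mathrm{d}\mu \cdot \int_0^\infty x^2 e^{-tx}\,\mathrm{d}\mu = f(t)\, f''(t),$$
hence $(\log f)'' = (f'' f - f'^2)/f^2 \geq 0$.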
Remarks: The current tools in information theory don't work. More sophisticated tools should be built to attack this problem.
A new mathematical foundation of information theory
True vs. False
If GCMC is true:
A fundamental breakthrough in mathematical physics, information theory, and any discipline related to the Gaussian distribution
A new expression for Fisher information
Derivatives form an invariant
Though $h(X + \sqrt{t}\,Z)$ looks very messy, certain regularity exists
Application: Gaussian interference channel?
If GCMC is false:
No failure, as the heat equation is a physical phenomenon
A Gauss constant (e.g., 2019) where the Gaussian distribution fails. Painful!
Complete Monotonicity:
How to Solve Gaussian Interference Channel
Two fundamental channel coding problems: BC and GIC
$h(aX_1 + bX_2 + Z_1)$, $h(cX_1 + dX_2 + Z_2)$ exceed the capability of EPI
Han-Kobayashi inner bound
Many researchers have contributed to this model
Foundation of wireless communication
The Thick Shell over $h(X + \sqrt{t}\,Z)$
$h(X + \sqrt{t}\,Z)$ is hard to estimate:
The p.d.f. of $X + \sqrt{t}\,Z$ is messy
$f(x)\log f(x)$, $\int f(x)\log f(x)\,\mathrm{d}x$: no generally useful lower or upper bounds
-- the thick shell over $X + \sqrt{t}\,Z$
Analysis: alternating is the worst
If the CM property of $h(X + \sqrt{t}\,Z)$ is not true: take 5 for example; if CM breaks down after $n = 5$ and we just take the 5th derivative, there may be nothing special (so GIC won't be so hard). CM affects the rate region of GIC.
Prof. Siu, Yum-Tong: "Alternating is the worst thing in analysis, as the integral is hard to converge, though CM is very beautiful."
It is not strange that the Gaussian distribution is the worst in information theory.
Common viewpoint: information theory is about information inequalities: EPI, MGL, etc.
CM is a class of inequalities. We should regard it as a whole in application, and pivot our viewpoint away from individual inequalities.
Information Decomposition
The lesson learned from complete monotonicity
$$I(X + \sqrt{t}\,Z) = \int_0^\infty e^{-tx}\,\mathrm{d}\mu(x)$$
Two independent components: $e^{-tx}$ stands for complete monotonicity; $\mathrm{d}\mu(x)$ serves as the identity of $I(X + \sqrt{t}\,Z)$.
Information decomposition: Fisher information = complete monotonicity + Borel measure
CM is the thick shell. It can be used for estimates via majorization theory. Very useful in analysis and geometry.
$\mathrm{d}\mu(x)$ involves only $x$; $t$ is removed. The thick shell is removed from Fisher information. $\mathrm{d}\mu(x)$ is relatively easier to study than Fisher information. We know very little about $\mathrm{d}\mu(x)$.
CM alone is useless for (network) information theory. The current constraints on $\mathrm{d}\mu(x)$ are too loose. Only the "special one" is useful; otherwise every CM function would have the same meaning in information theory.
CM && GIC
A fundamental problem should have a nice and clean solution.
To understand complete monotonicity is not an easy job (10 years).
Top players are ready, but the football is missing…
Thanks!
Guangyue
Raymond, Chandra, Venkat, Vincent...