SAR-C1: Elements of Information Theory
Final Exam
Pablo Piantanida and Olivier Rioul
December 12th

Documents authorized. Please use separate pages between Ex. 1 and Exs. (2, 3, 4). Answer questions in the same order as they are listed. Do not copy the questions; turn in the answers only. Write clearly, and show your reasoning with mathematical rigor. All these statements are important for your score.

1  Bounding discrete entropy

In this problem, we provide a method to construct a continuous random variable (RV) from a discrete RV, which allows us to make the connection between discrete and differential entropy in a different way than in the lectures. Consider a discrete RV $X \in \mathbb{Z}$ taking integer values with mean $\mu = \sum_{x \in \mathbb{Z}} x\, p(x)$ and variance $\sigma^2 = \sum_{x \in \mathbb{Z}} (x - \mu)^2\, p(x)$. Choose a continuous RV $U$ uniformly distributed over the interval $[0, 1]$, independent of $X$, and define $\bar{X} = X + U$.

(a) Find the probability distribution $p(\bar{x})$ of the continuous RV $\bar{X}$ and show that its variance is given by $\sigma^2_{\bar{X}} = \sigma^2 + \frac{1}{12}$.

(b) Show that the differential entropy of $\bar{X}$ equals the (absolute) entropy of $X$: $H(\bar{X}) = H(X)$.

(c) Prove the following bound on the entropy of the discrete RV $X$:
$$H(X) \leq \frac{1}{2} \log\!\left(2\pi e\left(\sigma^2 + \frac{1}{12}\right)\right).$$

(d) Application: if $X$ follows a Binomial distribution of length $n$ with probability $p$, prove that
$$H(X) \leq \frac{1}{2} \log\!\left(2\pi e\, np(1-p) + \frac{\pi e}{6}\right).$$

(e) Show, by using the convergence of the Binomial law to the Gaussian law, that this bound becomes tight as $n \to \infty$.
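As an illustration only (not part of the exam questions), here is a minimal Python sketch for checking parts (d) and (e) numerically: it computes the exact entropy of a Binomial$(n, p)$ RV from its pmf (in nats) and compares it with the bound $\frac{1}{2}\log(2\pi e\, np(1-p) + \pi e/6)$. The values of $n$ and $p$ are arbitrary choices for the check; the gap should shrink as $n$ grows.

import numpy as np
from scipy.stats import binom

p = 0.3                                   # arbitrary success probability
for n in (10, 100, 1000, 10000):          # increasing n to see the bound tighten
    pmf = binom.pmf(np.arange(n + 1), n, p)
    pmf = pmf[pmf > 0]                    # drop zero masses to avoid log(0)
    H = -np.sum(pmf * np.log(pmf))        # exact entropy H(X), in nats
    bound = 0.5 * np.log(2 * np.pi * np.e * n * p * (1 - p) + np.pi * np.e / 6)
    print(f"n={n:5d}  H(X)={H:.4f}  bound={bound:.4f}  gap={bound - H:.4f}")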

2  Coding theorem for AWGN channels

Consider an additive white Gaussian noise (AWGN) memoryless channel $\{W : \mathbb{R} \to \mathbb{R}\}$ with $W(y|x) = \mathcal{N}(x, N)$ and an input PD $P(x) = \mathcal{N}(0, P)$, where $N$ is the variance of the channel noise and $P$ is the power of the inputs. For any $\gamma > 0$, let $\mathcal{T}^n_{XY}$ be the set of pairs of sequences $x = (x_1, \ldots, x_n) \in \mathcal{X}^n$ and $y = (y_1, \ldots, y_n) \in \mathcal{Y}^n$ defined as
$$\mathcal{T}^n_{XY} = \left\{ (x, y) : \frac{1}{n} \log \frac{W^n(y|x)}{P^n(y)} \;\cdots \right\}$$