MTHE/MATH 474/874 – Information Theory: Homework #5


(1) Consider the binary Polya contagion Markov source treated in Example 3.17 in the textbook with memory M = 2. We are interested in sending this source over the memoryless
binary symmetric erasure channel (BSEC) with crossover probability ε and erasure probability α using rate-Rsc block source-channel codes.
(a) Write down the sufficient condition for reliable transmissibility of the source over
the BSEC via rate-Rsc source-channel codes in terms of ε, α, Rsc and the source
parameters ρ := R/T and δ := ∆/T.
(b) If ρ = δ = 1/2 and ε = α = 0.1, determine the permissible range of rates Rsc for
reliably communicating the source over the channel.
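
For part (b), a small numeric sketch in Python may help once the condition from part (a) is in hand. It assumes the standard BSEC capacity formula C = (1 − α)(1 − h_b(ε/(1 − α))) bits/channel use and the convention that a rate-Rsc code carries Rsc source symbols per channel use; the Polya entropy rate from Example 3.17 is left for you to plug in.

import math

def hb(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsec_capacity(eps, alpha):
    """Capacity of the binary symmetric erasure channel, in bits per channel use,
    via the standard formula C = (1 - alpha) * (1 - hb(eps / (1 - alpha)))."""
    return (1 - alpha) * (1 - hb(eps / (1 - alpha)))

C = bsec_capacity(eps=0.1, alpha=0.1)
print(f"BSEC capacity: C = {C:.4f} bits/channel use")
# With the Polya source entropy rate H (bits/source symbol) from Example 3.17,
# the sufficient condition Rsc * H < C yields the permissible range Rsc < C / H.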
(2) Answer the following questions.
(a) Let X be a log-normal random variable with parameters µ ∈ R and σ > 0 and pdf
given by
f_X(x) = (1/(σx√(2π))) e^(−(ln x − µ)²/(2σ²)), x > 0.
Determine its differential entropy (in nats).
(b) Show that among all continuous random variables X admitting a pdf with support (0, ∞), having finite differential entropy, and satisfying E[ln X] = µ and E[(ln X − µ)²] = σ², where µ ∈ R and σ > 0 are fixed parameters, the log-normal random variable with parameters µ and σ maximizes differential entropy.
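For part (a), a quick numerical cross-check of whatever closed form you derive, sketched in Python: it integrates −f ln f directly and compares against the candidate h(X) = µ + (1/2) ln(2πeσ²) nats, the standard log-normal differential entropy. The parameter values are illustrative assumptions.

import math
from scipy.integrate import quad

mu, sigma = 0.3, 0.8  # illustrative parameters (an assumption; any mu and sigma > 0 work)

def f(x):
    """Log-normal pdf with parameters mu and sigma."""
    if x <= 0.0:
        return 0.0
    return math.exp(-(math.log(x) - mu) ** 2 / (2 * sigma ** 2)) / (sigma * x * math.sqrt(2 * math.pi))

def integrand(x):
    fx = f(x)
    return -fx * math.log(fx) if fx > 0.0 else 0.0

h_numeric, _ = quad(integrand, 0.0, math.inf)  # h(X) = -integral of f ln f, in nats
h_candidate = mu + 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)
print(f"numeric: {h_numeric:.6f}, candidate closed form: {h_candidate:.6f}")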
(3) Rényi information measures: Given two pdfs f and g with common support S ⊆ R, consider the following Rényi information measures (in nats) of order α, for α > 0, α ≠ 1.

Rényi differential entropy of f: h_α(f) := (1/(1 − α)) ln ∫_S f(x)^α dx.

Rényi divergence between f and g: D_α(f‖g) := (1/(α − 1)) ln ∫_S f(x)^α g(x)^(1−α) dx.
(a) Determine each of the above quantities for the Gaussian densities f ∼ N(0, σ₁²) and g ∼ N(0, σ₂²).
(b) Find the limits as α → 1 of the measures obtained in (a) and comment qualitatively.
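
For (a) and (b), a numerical sketch in Python can validate the closed forms you derive and illustrate the α → 1 limits; the standard deviations are illustrative assumptions, and the limiting targets used below are the standard Gaussian differential entropy and Kullback-Leibler divergence.

import math
from scipy.integrate import quad

s1, s2 = 1.0, 2.0  # illustrative standard deviations (an assumption)

def npdf(x, s):
    """Zero-mean Gaussian pdf with standard deviation s."""
    return math.exp(-x * x / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

def renyi_entropy(alpha, s):
    """h_alpha(f) = (1/(1-alpha)) ln of integral of f(x)^alpha, computed numerically (nats)."""
    val, _ = quad(lambda x: npdf(x, s) ** alpha, -math.inf, math.inf)
    return math.log(val) / (1 - alpha)

def renyi_divergence(alpha, sa, sb):
    """D_alpha(f||g) = (1/(alpha-1)) ln of integral of f^alpha g^(1-alpha), numerically (nats)."""
    val, _ = quad(lambda x: npdf(x, sa) ** alpha * npdf(x, sb) ** (1 - alpha), -math.inf, math.inf)
    return math.log(val) / (alpha - 1)

for alpha in (0.5, 0.9, 0.99, 0.999):
    print(alpha, renyi_entropy(alpha, s1), renyi_divergence(alpha, s1, s2))
# As alpha -> 1 these should approach the differential entropy 0.5*ln(2*pi*e*s1^2)
# and the KL divergence 0.5*(ln(s2^2/s1^2) + s1^2/s2^2 - 1), respectively.
print(0.5 * math.log(2 * math.pi * math.e * s1 ** 2),
      0.5 * (math.log(s2 ** 2 / s1 ** 2) + s1 ** 2 / s2 ** 2 - 1))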
(4) Answer the following questions.
(a) Integral analogue of the log-sum inequality: Given non-negative functions a(·) and b(·) on R^n, show that

∫ a(x^n) ln (a(x^n)/b(x^n)) dx^n ≥ a ln (a/b),

where a := ∫ a(x^n) dx^n, b := ∫ b(x^n) dx^n, and all integrals are assumed to exist.
(b) Data processing inequality for the divergence: Consider a memoryless continuous channel with real input and output alphabets X = Y = R and transition pdf f_{Y|X} with support R². Let random variables X1 and X2, having common support R and respective pdfs f_{X1} and f_{X2}, be two possible inputs to the channel, with corresponding channel outputs Y1 and Y2, respectively. Show that

D(X1‖X2) ≥ D(Y1‖Y2).
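
One natural route for (b), sketched on the assumption that part (a) is already established: for each fixed y, apply (a) with a(x) := f_{X1}(x) f_{Y|X}(y|x) and b(x) := f_{X2}(x) f_{Y|X}(y|x); the channel factor cancels in the ratio a(x)/b(x), and integrating the resulting pointwise inequality over y gives

D(Y1‖Y2) = ∫ f_{Y1}(y) ln (f_{Y1}(y)/f_{Y2}(y)) dy ≤ ∫∫ a(x) ln (a(x)/b(x)) dx dy = D(X1‖X2),

using the fact that ∫ f_{Y|X}(y|x) dy = 1 for every x.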
(5) Determine (with justification) whether each of the following statements is True or False:
(a) If X and Y are real-valued independent random variables, then
h(−4X − 3Y − 10) ≥ h(X) + 2 (in bits).
(b) Suppose that random variables X, Y and Z are jointly Gaussian, each with mean 0 and variance 1. Assume that X → Y → Z and that E[XY] = ρ, where 0 < ρ < 1. Then

I(X;Z) > (1/2) log2(1/(1 − ρ²)).
(c) Consider a network of two parallel memoryless Gaussian channels:

Y1 = X1 + Z1
Y2 = X2 + Z2

under an overall power constraint E[X1²] + E[X2²] ≤ P, where (X1, X2) and (Z1, Z2) are independent and the noise variables Z1 and Z2 are zero-mean Gaussian with covariance matrix

K = ( σ²   0  )
    ( 0   2σ² ),  σ > 0.
If P > 3σ², then the optimal input power allocation that maximizes the overall channel capacity is

(P1, P2) = ((P + σ²)/2, (P − σ²)/2).
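
For (c), a minimal water-filling sketch in Python; the bisection solver and parameter values are illustrative assumptions, and it simply compares the computed allocation against the claimed pair.

def waterfill(P, noise, tol=1e-12):
    """Solve sum_i max(nu - N_i, 0) = P for the water level nu by bisection,
    then return the per-channel powers max(nu - N_i, 0)."""
    lo, hi = min(noise), min(noise) + P + max(noise)
    while hi - lo > tol:
        nu = (lo + hi) / 2
        if sum(max(nu - n, 0.0) for n in noise) < P:
            lo = nu
        else:
            hi = nu
    nu = (lo + hi) / 2
    return [max(nu - n, 0.0) for n in noise]

sigma2 = 1.0            # sigma^2 (illustrative value)
P = 5.0 * sigma2        # chosen so that P > 3*sigma^2
P1, P2 = waterfill(P, [sigma2, 2.0 * sigma2])
print(P1, P2)                                # water-filling allocation
print((P + sigma2) / 2, (P - sigma2) / 2)    # claimed (P1, P2)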
(6) Consider two zero-mean Gaussian random variables X1 and X2 with (positive) variances P1 and P2, respectively: X1 ∼ N(0, P1) and X2 ∼ N(0, P2). Let

Y1 = X1 + Z

and

Y2 = X2 + Z,

where Z ∼ N(0, σ²) is a zero-mean Gaussian random variable with (positive) variance σ² that is independent of both X1 and X2.

(a) Determine, in nats, D(X1‖X2) and D(Y1‖Y2) in terms of P1, P2 and σ².

(b) Compare D(X1‖X2) to D(Y1‖Y2) and comment qualitatively.
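
A numeric cross-check for (a) and (b), assuming the standard closed form D(N(0, a)‖N(0, b)) = (1/2)(ln(b/a) + a/b − 1) nats for zero-mean Gaussians; the variance values below are illustrative assumptions.

import math

def kl_zero_mean_gauss(a, b):
    """D(N(0,a) || N(0,b)) in nats, for variances a, b > 0."""
    return 0.5 * (math.log(b / a) + a / b - 1.0)

P1, P2, sigma2 = 4.0, 1.0, 2.0  # illustrative variances (an assumption)
D_in = kl_zero_mean_gauss(P1, P2)
D_out = kl_zero_mean_gauss(P1 + sigma2, P2 + sigma2)
print(D_in, D_out)  # expect D_in >= D_out, in line with the data processing inequality in 4(b)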
(7) Solve the following problems from the textbook:
(i) Problem 5.19.
(ii) [MATH 874 only] Problem 5.18.